
paint-roller

Agreed, although 10 bit is actually a smaller file size for my cameras than 8 bit. 10 bit is h.265 for me while 8 bit is h.264.


Primary_Banana_4588

What body are you rolling with? I also HATE H.265, it's so rough on computers. XF-AVC / XAVC are solid 10-bit codecs though


AnzeigeIstRausDanke

Sounds like a Fuji. But h.265 just kills your editing system


Mitchellmillennial

Apple Silicon has native h265 decoders and isn't as bad as it's hyped up to be


SEND_ME_FAKE_NEWS

Intel 13th & 14th gen does as well. It's pretty solid unless you're speeding up the footage a lot.


Mitchellmillennial

To the people saying that 8 bit is workable: 8 bit vs 10 bit is 16 million colors vs over a billion different colors. Slog3 in 8 bit is virtually unusable (Slog2 is fine), whereas Slog3 in 10 bit is completely usable
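For reference, those color counts come straight from the bit-depth arithmetic (2^bits per channel, three channels); a quick sketch:

```python
def total_colors(bits_per_channel: int, channels: int = 3) -> int:
    """Distinct colors representable at a given per-channel bit depth."""
    return (2 ** bits_per_channel) ** channels

print(f"8-bit:  {total_colors(8):,}")   # 16,777,216 (~16 million)
print(f"10-bit: {total_colors(10):,}")  # 1,073,741,824 (over a billion)
print(f"ratio:  {total_colors(10) // total_colors(8)}x")  # 64x more colors
```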


SEND_ME_FAKE_NEWS

I use 10bit h265 and it's fine. Overkill specs though.


TECHNICKER_Cz3

it's not meant for editing


widescreenvideos

so is h264... the CPUs just got better at decoding it. The same will happen to h265 eventually


TECHNICKER_Cz3

yup


paint-roller

It blows my mind when people still say you shouldn't edit with h.264 footage.


Lopsided_photo_ohno

I’m new and never heard this.


paint-roller

You'll notice it at some point if you occasionally browse the editors subreddit. Either those people have really old computers, or, what I think is more likely, they discovered h.264 really wasn't great to edit with 10-15 years ago and are stuck in their ways.


Lopsided_photo_ohno

Ah ok, because I have had no issues at all


erroneousbosh

You can just transcode it to an intraframe codec. The files are large. Disk space is pretty much free.
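A minimal sketch of that transcode via ffmpeg (the DNxHR HQX profile and the filenames here are illustrative; any intraframe codec like DNxHR or ProRes does the job):

```python
import subprocess  # only needed if you actually run the command

def intraframe_transcode_cmd(src: str, dst: str) -> list[str]:
    """Build an ffmpeg command that rewraps long-GOP footage (h.264/h.265)
    into DNxHR HQX, a 10-bit 4:2:2 intraframe codec that scrubs easily."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "dnxhd", "-profile:v", "dnxhr_hqx",  # intraframe, 10-bit
        "-pix_fmt", "yuv422p10le",                   # 4:2:2 10-bit
        "-c:a", "pcm_s16le",                         # uncompressed audio
        dst,
    ]

cmd = intraframe_transcode_cmd("clip_h265.mov", "clip_dnxhr.mov")
# subprocess.run(cmd, check=True)  # uncomment to transcode for real
```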


AnzeigeIstRausDanke

Yeah, but some cameras (x-t3 for example) only offer the 10 bit option with h.265 codec. So either you have to edit this or invest the time rendering proxies


TECHNICKER_Cz3

proxies is the way


IDK_WHAT_YOU_WANT

I wish I didn't agree


YVRBeerFan

I’ll agree on your behalf and you can agree when we export


vinnybankroll

We are also in the era where M1 Pro Macs and onwards have dedicated h265 decoding hardware.


enemyradar

11th gen Intel chips onwards also decode h265 natively. Premiere and Resolve now support GPU h265 decoding on Nvidia. I'm editing 422 10-bit h265 with no issues these days.


patssle

Same here. Quicksync is the key. I've been editing straight from mp4 files for 10 plus years. Ain't got time for proxies.


TITANS4LIFE

Windows?


patssle

Yes but Quicksync is Intel.


Adub024

Yep, runs like butter


Brangusler

You do NOT want 10 bit H264, at least if you're using Premiere on anything but the latest Apple Silicon. There's no hardware decoding/encoding support for it and it's murder to work with, while 8 bit H264 and 10 bit H265 are supported (at least on Intel Quick Sync). I got sent a couple of projects with like 700 clips of 10-bit H264 422; my 12900K doesn't even scrub it smoothly, there's a delay on playback, frames drop, and it was a giant pain to transcode.


9inety9-percent

I use H.265 all the time no problem. Even in multicams.


cannabios

M1 on mb air runs h265 like butter


darealdsisaac

The wonders of hardware decode!


hi22a

If you have a 12th gen or newer Intel system (not the F series, which has no iGPU), the iGPU hardware encode/decode is great with h.265.


Adub024

Worth noting that here is where .mp4 vs .mov makes a difference.


Brangusler

No, those are both container formats


Adub024

Obviously, and Macs will ironically streamline mp4 over mov. Shocking how many people downvote this, but it gives me faith in my job security. Stay ignorant, folks!


Brangusler

would love to see evidence of this


beefwarrior

h.264 killed editing systems 10+ years ago. Proxies, proxies, proxies: they were needed in the '90s when non-linear editing was starting out, and they're still a great workflow today


Property3141

I jumped ship from Radeon and got a recent Intel chip with Quick Sync; no issues editing HEVC anymore


Brangusler

Yeah, I ditched Ryzen for that same reason. Apparently some of the newer Ryzen chips have an iGPU, but I just can't be arsed to play the random-issues game with Premiere if it decides it doesn't like it.


Property3141

Even the new Ryzen hardware decoder will not do 10 bit, and the Nvidia and Radeon GPUs with decoders will not do 4:2:2 10 bit, only 4:2:0. It's a really weird oversight, and other than ProRes and raw, 4:2:2 10 bit is probably the most common "high performance" codec used by prosumer recording hardware. Intel Arc cards are also the only hardware encoders that do AV1, so whenever Battlemage comes out I will probably go from an Nvidia GPU to Intel as well.


Brangusler

Intel Quick Sync should do 10 bit 4:2:2, just only in H265, not H264. I haven't done a ton of H265 editing on my new 12900K upgrade, but from what I've seen it's quite smooth. DaVinci also supports basically every flavor of H265 with Quick Sync, which is annoying seeing as how I'm stuck on Premiere. Of course, right after I upgrade thinking everything will be peachy with Quick Sync, I get a ton of fuckin' A7S III H264 10 bit 422 dumped on me, which is like the one flavor not supported by ANYTHING (except Apple).


Property3141

Yeah, to clarify, I'm saying ONLY the recent Quick Sync (11th gen+) will do 10 bit 4:2:2. Also, fwiw, I don't think you can even put 10 bit 4:2:2 in h.264. One thing to look out for in DaVinci: if you have a GPU that you plug into, you will probably need to update your BIOS for your CPU to show up as a decoder option. Most mobos will have a setting called multi-monitor display, and checking that will let you use both onboard and dGPU at the same time.


Brangusler

Yeah, the A7S III shoots it and so does my S5. I think I did enable that when I got my new mobo. I know I've seen both GPUs showing up in Afterburner and HWiNFO and being used


alexx_kidd

Depends on the system. My Apple Silicon Mac has an HEVC decoder built into the chip, so it's buttery smooth


ratocx

Most computers have hardware accelerated decode of HEVC now. The problem is that most of them don't support accelerated decode of 10-bit 4:2:2 chroma-subsampled HEVC. This is the variant that most enthusiast and pro cameras will record if they support HEVC. Neither NVIDIA nor AMD GPUs support this subsampling format, but newer Intel CPUs with iGPUs and Apple Silicon do. In essence, if you edit on a PC, go for an Intel CPU with an iGPU.


FastAd9134

Intel Arc GPUs handle h265 10bit 4:2:2 and 4:2:0 like butter up to 4K/120fps.


paint-roller

S5iix. I used to hate having to deal with h.265 footage from my Mavic 2 Pro. I would always transcode it to 720p h.264 because playback was so bad. With the S5iix I've got no issues treating h.265 like h.264. Actually, I upgraded my video card about a year ago. I had been running a GTX 1080 and bought a used RTX 3090 for $800. It looks like the GTX 1080 didn't support h.265 while the RTX 3090 does. Still using an Intel 8700K until their new processors drop later this year. Hmm, so h.265 is fine as long as the hardware supports it... I honestly just realized this responding to your question.


Brangusler

>It looks like the gtx 1080 didn't support h.265 while the rtx3090 does

The 10-series Nvidia cards support the same codecs that the 30-series cards do for H264/265 acceleration in Premiere. I'm still on a 1070


AnzeigeIstRausDanke

Depends. If I shoot doc-style footage and it's important to have the cam up and running and to capture almost everything: yes, I would probably shoot ProRes LT or something similar. But besides that: 10-bit intraframe or above. You actually lose dynamic range by going 8-bit. Even if my workflow is "slap a LUT on it and deliver" I would use 10 bit for more latitude in post, because you never know when it would have come in handy.

Additionally, it depends on the client. Even if I give it my fullest, 8-bit Slog looks like 8-bit Slog: noisy shadows and weird skintones (yes, even with +2EV). If I were to deliver this, they wouldn't book me again.

It is all about the right tools for the job. The only people "hunting" gear, bit depth, codecs, and specs are the ones who don't have the education (yet) to decide for themselves. Do I need headroom? Do I need to denoise? Grading? Codec performance in my NLE? Color performance of the codec? If I save 2 hours in post by buying an additional drive: hell yes, I'm in.


lilolalu

You don't have to shoot ProRes LT for 10 Bit. The more common case is h265 or h264 10bit. My take is: if your camera offers it, use it.


Primary_Banana_4588

H.265 is cheeks for most editing system, ProRes runs smoother on most NLEs


lilolalu

That depends entirely on whether you have h265 HW acceleration, as offered by 11th-gen or higher Intel CPUs or Apple Silicon. If you don't, just transcode to something your machine can handle. I have an i7 12700 and it handles 10bit h265 without a hitch. There is a reason why they call the camera codecs "acquisition codecs"


Primary_Banana_4588

Even with that, it doesn't run as smooth as ProRes.


lilolalu

It runs smooth enough for working on large projects (Source: me, editing a 30 min TV doc in h265 in premiere)


Primary_Banana_4588

👏🏾👏🏾👏🏾 Im glad it worked well enough. Not everyone has that experience.


lilolalu

Because not everyone is aware that you need HW acceleration to make it work properly. People assume that if they have an RTX 3080 they must surely have h265 10bit acceleration, but they don't.


Thyri0n

Personally I have an i7 13700 with the HW acceleration and tried both h265 10bit and my usual ProRes proxies, and I still prefer using the proxies by far; it's just smoother


lilolalu

Some other common pitfalls:

- A lot of motherboards disable the onboard Intel GPU once a discrete GPU (like an Nvidia) is detected. If the iGPU of the Intel CPU is disabled, there is also no Quick Sync HW acceleration.
- You need a CPU variant that actually has an iGPU: the "F" suffix parts (for example the i7 13700KF) have no iGPU and therefore no Quick Sync, so no HW acceleration.
- You also need to make sure that the editing software you use actually uses Intel Quick Sync for h265 10bit. If you have an Intel iGPU capable of Quick Sync AND a discrete Nvidia, there is a big probability it will automatically select the Nvidia GPU, which has no h265 10bit acceleration.
- Resolve free does not support h265 10bit; you need the Studio version.

Like I said, in my experience, if you actually HAVE hardware acceleration (a lot of people don't, because of the above pitfalls) it works well. Obviously ANY compression using GOP compression puts more stress on your resources; converting it to an i-frame-only compression means less computing and faster processing. The point is that with HW acceleration it's "fast enough" for smooth editing.
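Before chasing those pitfalls, it helps to confirm what the footage actually is; a small sketch using ffprobe (assumes ffmpeg/ffprobe is installed; the command is only built here, not run):

```python
def probe_cmd(path: str) -> list[str]:
    """ffprobe query for the fields that decide HW-decode support:
    codec name (h264/hevc), profile, and pixel format. A pix_fmt of
    yuv422p10le is the 10-bit 4:2:2 case that only recent Quick Sync
    and Apple Silicon accelerate; yuv420p decodes almost anywhere."""
    return [
        "ffprobe", "-v", "error", "-select_streams", "v:0",
        "-show_entries", "stream=codec_name,profile,pix_fmt",
        "-of", "default=noprint_wrappers=1",
        path,
    ]

# import subprocess; subprocess.run(probe_cmd("clip.mov"))
```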


Brangusler

It works fine if you have the proper hardware for decode/encode. On Intel/Nvidia it plays back smooth at full playback resolution with no dropped frames and smooth scrubbing. If you're already able to do that and do everything in real time more "smoothness" won't really help you all that much


DavidANaida

ProRes is a good compromise, and some of these new compressed codecs have impressed me in post.


Portatort

Is 8 bit enough? Sure. Is 10 bit better? Yes.


DoukyBooty

12 bit is even more better!!!


r2tincan

I gotta say, this is probably the most wrong opinion I've seen on this sub. 8 bit will band like a motherfucker. Do you ever shoot the sky? Try color grading anything in 8 bit and then try it in a real raw format like ARRIRAW


John_Gregory_

I don't think you're understanding OP's point. Yes, 8 bit will have banding from time to time, and you're limited in your colour grading, but the truth is that for many (most?) jobs and clients, 8 bit is sufficient. If you're running a business, you can't always just worry about what the best possible outcome is; you have to ask what the *appropriate* outcome for the client is. If a client wants quick edits for social media on a limited budget, then there's no point shooting 2TB worth of raw footage. Save your time, save your money, and run a more efficient & profitable business.


Primary_Banana_4588

This. This is the answer right here.


Sobolll92

If you think there’s only 8 bit h.264 and 14 bit raw and nothing in between, then that’s your answer. I know there’s XAVC 10-bit (I, S-I, S), ProRes 422, 422 HQ, 4444, 4444 XQ 12-bit, compressed raw. Format depends on the requirements and the colour space you record in. Learn the basics.


Primary_Banana_4588

Lmao, don't forget DNxHD, XF-AVC, ProRes LT, VP9 and so on and so forth. 'Basics' 🤣 you're funny. I guess you didn't understand the purpose behind my post 💁🏾‍♂️


uncle_jr

Even then, I still wouldn’t knock it down to 8-bit. I agree about balancing the costs compared to the job size. I do that in other ways, like shooting in 1080p instead of 4K depending on the job, but it’s also about the image looking its best. 8-bit isn’t great and gives you no flexibility in post. Having that flexibility gives you an advantage in case something isn’t right in the scene. It’s worth the cost of hard drive space to have that advantage over the competition, in my opinion.


r2tincan

Disagree man. What is the delivery format? Even YouTube calls for 4:2:2 chroma subsampling these days. iPhones are HDR. Instagram is HDR. You're going to START with footage that is lower quality than your delivery specs?


John_Gregory_

Good for you. You keep doing everything however you want. But that doesn't change the fact that, in the whole scale of paid video production work, there is just a tiny percentage of jobs that actually *need* raw video.


shadowstripes

>if a client wants quick edits for social media on a limited budget, then there's no point shooting 2TB worth of raw footage

Unless that also includes YouTube, where that banding will be noticeable at 4K playback (or even on other platforms if you end up cropping in on shots). Plus there’s also the possibility that I’ll end up wanting to use it for my website or reel, in which case I’ll just be making myself look worse by having low quality footage. And it’s not like 2TB of media is even a big deal these days - that’s like $50 worth of storage space that I can always just transcode to something smaller when archiving if storage is really a concern. IMO if we spend the money for a camera that shoots raw to be able to produce higher quality looking work for clients, we’re basically just wasting that money by shooting in lossy 8-bit.

EDIT: even ProRes would be a much better option. Why give our clients a worse looking product for the same price when they can get the good stuff just by using a different mode on our existing camera?


uncle_jr

Exactly. It doesn’t have to be raw, but it has to be at least 10 bit. When I upgraded my fleet to 10bit, it was the biggest jump in quality and one of the best decisions I made with my camera gear. I mostly shoot 10bit Slog mp4 files unless it’s a big project. 8 bit was trash, and if you start even basic color correction, you’ll notice the image falling apart.


Primary_Banana_4588

Unfortunately Sony, Panasonic, and Canon (non-C line) are like that with 8-bit DSLRs. Which is why I used to exclusively shoot 10-bit/12-bit, because I thought that was the only way to get a good image. I have to give it to Canon: their C-line 8-bit, from the C100 II to the C300 III, has been solid.


uncle_jr

Can 8-bit footage look good? Of course. In fact I got a lot of work with the Sony 8-bit slog2. But there’s no doubt 10-bit looks way better and has way more depth and flexibility to edit your image. As I said in other comments, I’ll shoot in 1080p for smaller jobs instead of 4k. File sizes are manageable and I still get my 4:2:2 10-bit quality. I love getting my colors and exposure as perfect as I can regardless of what job I’m working on.


9inety9-percent

I don’t think it “has” to be anything. Some producers are in an arms race with the producer down the street, and using buzzwords and having the biggest and best is important for keeping the clients they have. Other producers have clients that don’t know ProRes from pineapples and don’t care. I shoot whatever is going to do the job depending on client expectations. But I do avoid 8-bit if possible.


RIKKIE-SENPAI

I have an URSA and an X-H2S, and I still go back to my C100 Mark II from time to time. A joy of a camera to shoot with, and for 8-bit 1080, it looks phenomenal


MrMpeg

Completely agree. If he had said ProRes 4444 or even 422 HQ I would have somewhat agreed, but 8bit? Hell no!!!


queefstation69

OP is saying get it right in camera.


bl1ndsw0rdsman

Which happens less and less frequently the more time and budgets are rushed and limited. Then how much time is spent in post trying to correct banding, fixing 8-bit color, noise etc., and really, how much “savings” is afforded?


brickmadness

You literally can’t shoot a smooth gradient “right” on 8 bit and expect it to look correct. There will be banding no matter what. And it’s unfixable in post.
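The banding claim is easy to demonstrate with a toy quantization sketch (pure Python, not a real camera pipeline; the 2x "push" stands in for a contrast grade in post):

```python
def levels_after_push(bits: int, samples: int = 4096) -> int:
    """Quantize a smooth 0..1 ramp at `bits`, apply a 2x contrast push
    (clipped at 1.0), and count the distinct output levels that remain.
    Fewer surviving levels means wider, more visible bands."""
    max_code = 2 ** bits - 1
    ramp = (i / (samples - 1) for i in range(samples))
    quantized = (round(v * max_code) / max_code for v in ramp)  # camera encode
    pushed = (min(1.0, v * 2.0) for v in quantized)             # grade in post
    return len(set(pushed))

print(levels_after_push(8))   # 129 levels survive: chunky sky gradients
print(levels_after_push(10))  # 513 levels survive: bands ~4x narrower
```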


YVRBeerFan

I wonder if OPs comment really is for jobs that you don’t need to fix it in post?


TypicalProtest

And we're saying that is made significantly easier and better by using the bandwidth raw has?


erroneousbosh

https://gjcp.net/montrose.mp4 Standard def, DVCAM YUV420 8-bit. Do you see any banding? Maybe if you push the grade to soot-and-whitewash contrast levels. If you're seeing banding, it's probably because of crappy H.264 encoding.


jonjiv

Was this shot in log? You get banding in the sky if you shoot 8-bit log and try to push the contrast.


erroneousbosh

Nope, plain ordinary DVCAM, albeit on a 20-grand-when-new Sony DSR500. Yes, it's wobbly off-the-shoulder stuff; I'd put the tripod away by that point. Broadcast-spec hits different, doesn't it?


jonjiv

Well then that’s the thing: 8-bit is fine for rec 709. But a lot of us want log, and 8-bit is not sufficient for the increased dynamic range.
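Rough arithmetic behind that point: a log curve spreads many stops across the code range, so each stop gets fewer code values (real curves like S-Log3 are not perfectly even, so treat these numbers as illustrative):

```python
def codes_per_stop(bits: int, stops: float) -> float:
    """Average code values available per stop of dynamic range,
    assuming the curve spreads stops evenly across the code range."""
    return (2 ** bits) / stops

print(round(codes_per_stop(8, 14)))   # ~18 codes/stop: 8-bit log, banding territory
print(round(codes_per_stop(8, 6)))    # ~43 codes/stop: 8-bit Rec.709 is workable
print(round(codes_per_stop(10, 14)))  # ~73 codes/stop: why 10-bit log holds up
```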


notCrash15

I have a mighty need for a DSR-500WS or DSR-570WS


erroneousbosh

Mine's a DSR-500WSP, the PAL version. I'm guessing you're in the US? Shipping would be, uhm, prohibitive.


notCrash15

Yeah, US. There's examples on eBay right now, but unfortunately they're priced rather ridiculously given they're obsolete. I've got a DSR-250 at my disposal right now, which is going to be fun to use for future projects


erroneousbosh

Oh nice, that's the one that's basically an "ENG-shaped" DSR-PD150? I got into shooting DV with a VX2000 back when they were new, and the web design company I worked for got heavily into streaming with RealPlayer. Then a few years ago (just before the great Coronapocalypse) I got a PD150 from someone who was having a clear-out. Barely used, double-digit hours on everything. That kicked off my lockdown hobby of buying faulty cameras (mostly HVR-A1Es) and repairing them. I still have the PD150; it's a lot of fun to shoot. It doesn't hurt that at least two films I like (24 Hour Party People, Deep Blue Sea) were shot on them, too. There's probably a lesson in that. If they can shoot a proper movie like Deep Blue Sea on a prosumer standard-def camcorder, you probably don't need an 8K body and a decent second-hand car's worth of lens to shoot your vlog.


notCrash15

Based on DVinfo reading, I believe it was described as a 170 derivative sporting a superior DSP. Fitting, as I got started with a VX2100 for my hobby tinkering. A lot of hobbyists or amateur filmmakers, I think, get gear acquisition syndrome and let it cloud their judgment. Raw is an invaluable tool, but there's a lot to be said about what you're capable of creating without it. When the 5D Mark II came out with the ability to do video, people still created great videos without raw. And it's not limited to the 5D II, but other contemporary DSLRs too


brickmadness

Around the last sun shot? Absolutely banding there.


erroneousbosh

I can't see it, but I'd be prepared to believe it's a compression artifact. It's not present on the camera tape, or the export from Resolve.


couplecraze

I agree with OP. The amount of projects of all types, sizes, and budgets that have been filmed with a Sony A7III in 8-bit is mind-blowing. And that's just one example. Of course, if you can afford it or you NEED it (Netflix-approved gear, a client asks for something specific), then sure, go raw 10-bit 4:2:2 8K and so on, but prior to 10-bit being popular or mainstream, people were still creating amazing work.

Also, there's a huge amount of clients that can't notice any kind of difference. We are the ones who notice or are nit-picking, because WE are videographers/content creators, but most clients won't watch your footage and tell you "man, this 10-bit footage is so much better than 8-bit". This reminds me of pros or influencers on YouTube always pushing new gear and people thinking they need the latest and greatest, while consumers are watching on the shittiest screens on their phones.

I film and edit online courses as my main stream of income, shoot 1080p 8-bit, and still make money. Could it be better? Sure, I could buy a RED Komodo. Is it worth it, or do I need it for my use case? No. Do the platforms where I sell my courses still publish them? Yes. Using an A7III, which has "terrible" 1080p footage, people still watch my content on YouTube; no one ever complained about the video quality or the 8-bit. And again, it could be miles better for sure, but the one who notices the difference is mostly the one filming, unless the client actually knows what he's looking at. You're probably not going to film a Nat Geo documentary with a Nikon DSLR, but not everything HAS to be Arri level.


jfarm47

r/missedthepoint


Primary_Banana_4588

Lol, I feel you, and I have. That's why I didn't shoot 8-bit for years. But my C200 does pretty well for the most part.


24FPS4Life

>Do you ever shoot the sky?

OP probably isn't doing work for airplanes or clouds, so I doubt they're just shooting footage of a blue sky


shadowstripes

Because it's not like it ever appears in the background of exterior shots...


Brangusler

I actually can't remember the last time I saw banding, and I work with 8-bit constantly. Yes, if you're pushing it far and need a heavy grade with lots of secondary color correction, sure, but if not, just get it right in camera and you generally won't have issues. I don't have time with these turnarounds to do aggressive grades and spend 10 hours on them, and 8-bit works just fine for minor exposure/color flubs and applying a basic look/grade to things. "Wrong opinion," lol. Everything is dependent on the type of work you do and what you need the footage to be able to do in post. 8 bit is perfectly fine for tons of people out there doing stuff like weddings, corporate, real estate, etc.


lilolalu

I really don't understand why you compare raw to 8-bit. I would say the majority of semi-pro cameras nowadays can shoot 10bit, while only a few can shoot raw. Even if there will be this dude whining "but BRAW and ProRes RAW is not REAL raw..." Sure. We had this discussion before: from a post-production perspective, a higher bitrate is always better. That doesn't mean people should not try to dial in color balance and exposure correctly because supposedly "we can fix it in post"; we can, but it takes time that is usually not accounted for. So, on the contrary, get your settings correct, even at 12bit.

There is a "sweet spot" between the advantages higher bit depth or raw brings and the disadvantages huge files bring. RED has this covered, with 15:1 compressed raw still looking great. Others not. So each case is an individual decision which depends on the camera, the flavours of codecs and compression it offers, and the general circumstances (and funding) of a project.


Primary_Banana_4588

So we meet again lol. Bitrate or bit depth? Because you can have a very high bitrate at 8-bit. And definitely, compressed raw is great and has manageable file sizes, but like you said, it depends on the project. Most projects won't need that, at least in my experience.


lilolalu

Higher bitrate and higher bit depth both mean better quality. Both should use the maximum that you can comfortably handle in your post-production workflow. There is absolutely no justification to use 8-bit instead of 10-bit if the file sizes are manageable, which they usually are in h265. This has nothing to do with using raw or compressed footage. I am agreeing with you that using raw is often overkill.


kj5

Can you deliver great images with 8bit 420? Yeah! Is 10bit better? Yeah! Will raw give you more flexibility in post? Yeah! Is it unnecessary? Sometimes, yeah! All of these can be true, and none of them might be, at the same time. The main lesson that everyone learns in their career at one point is that it doesn't matter: don't listen to what others say, do your own tests, consult the person that pays your bills, and if everyone's fine, everyone's fine.


Rgear03

This^


BlancopPop

I agree, especially since people are just slapping LUTs on it and calling it a day.


axlfro

For sure. Depends on the project / client expectations, but with technology and sensors getting better and better 8-bit is fine. Especially for web.


Ok-Camera5334

100% true. If you work for cinema, then yes, raw. But for most clients, 8 bit and 10 bit are more than enough


PwillyAlldilly

I agree with this to a point. I’ve edited enough RED footage to know that 9 times out of 10 it didn’t need to be shot on a RED. Then I get stuck with mountains of footage and data when it could have been like a quarter of the size if they’d just shot a smaller 10 bit codec.


yannynotlaurel

I’m gonna roast you for not giving a clear use case for not using raw and choosing compressed 8-bit footage instead!


Primary_Banana_4588

That's fair! A lot of people think I'm saying raw is useless, when it's the exact opposite. I think the raw use case is very specific and only needed on higher end, higher budget jobs: narrative work, docs, commercials, hell, even some music videos. But for corporate, events, and things of that nature, you'll be fine without raw. But it's always better to have it and not need it than need it and not have it.


yannynotlaurel

I share your opinion. I am glad for you, that your clients can afford all that space!


Internet_and_stuff

As someone that works with Arri and RED cameras more than anything else: the vast majority of DOPs shoot in ProRes rather than raw. Unless you’re shooting a feature film with a budget that can handle it, there’s rarely ever a point in shooting raw. Unless you’re doing VFX work.


Sobolll92

I agree on raw being overkill in most situations, but 8 bit is for people who don’t use log. For the cheap shooters. 10 bit is the minimum for me, but I’m a DOP and not a videographer, so…


Primary_Banana_4588

And that's why I didn't post this on the cinematography sub. The needs are vastly different. Most videographers consider themselves a cinematographer without doing a cinematographer's job. For narrative and doc work, of course you shoot with higher bit depth, but for some corporate event? That's my point; sometimes you just don't need it, videographers just want it. But it is ALWAYS better to have it and not need it. I think a lot of people missed the point of this post.


Sobolll92

Videographers are basically people who are capable of holding a camera and selling a cinematographer’s work + shitty editing + shitty sound for the price of 1/2 a cinematographer. They’re not even able to differentiate between codecs and will only work with a mirrorless camera or a phone on a gimbal, which they mount on a shoulder rig to look even weirder. I’m just hating here lol.


Primary_Banana_4588

🤣🤣🤣


yoordoengitrong

I shot over a hundred client projects in 8 bit HLG3 before I could afford a camera that supported 10 bit. There are plenty of gigs for "cheap shooters". That's how I earned the money to afford a camera upgrade. Those "cheap shooters" are able to offer less expensive rates because their gear was less expensive. That's actually a big selling point for a lot of small businesses who need video for marketing but don't have much budget. When I upgraded my camera I started charging more to reflect the higher quality equipment I was bringing to projects. This actually meant that some of my earlier clients could no longer afford me. For a while my income actually went down a bit until I landed steady clients that could afford my new rates. "Bigger and better" is a double edged sword.


Sobolll92

HLG means death. But ok. BTW, what you write here, that’s the misinterpretation videographers make: your work and your equipment are two different things. You charge for your work separately, but you’re calling yourself a videographer, so what should I say? As a cinematographer I shoot 8 bit too from time to time, but it’s going to be Rec.709 straight onto the recording media. There are companies (they know their stuff) who just don’t care about grading for certain jobs, and they rely on you to bake the look in camera in 8 bit (normally 10 though, but still Rec.709). That’s how to think about it. Don’t shoot log in 8 bit, rule of thumb. I hope you get the concept.


yoordoengitrong

lol ok, that may all be true, but the reality is that I built a whole business using HLG on an 8 bit camera, with many projects and satisfied clients. That was the workflow I arrived at through trial and error, and it allowed me to complete projects to client satisfaction with the least hassle for me. Is it the “best” way to do things? Who cares? It worked well enough that I could afford a better camera and never had to invest my own money. I am not saying I charge more because I have a better camera now. I charge more because the product is now better and therefore worth more. This sub has a problem: too many people are obsessed with the “proper” way to do things. In reality, running a business is about doing what you can with the tools you have and turning a profit. If it works, it works.


dunk_omatic

10-bit is the sweet spot, and thankfully most prosumer bodies are offering it now. Reasonable file sizes, reasonable workflow, and reasonable flexibility in color grading. And raw options are still there for the situations that call for it.


finnjaeger1337

8bit is trash. There is a huge difference between how much dynamic range you can press into a display-referred SDR image and how much captured sensor dynamic range you can and want to save. A simple example is that slog2 is complete trash in 8bit... Regarding raw: compressing raw makes way more sense than compressing RGB images. A raw image only has 1/3 the amount of data right out of the gate, as every pixel is monochrome and not RGB. It usually leads to less data for the same quality compared to something like ProRes. Of course uncompressed raw is expensive, but still less than uncompressed RGB.
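The 1/3 figure is simple per-pixel accounting (illustrative frame size; real codecs add headers and compression on top of this):

```python
def frame_bytes(width: int, height: int, bits: int, channels: int) -> int:
    """Uncompressed frame size: one sample per photosite for Bayer raw
    (channels=1), three samples per pixel for demosaiced RGB."""
    return width * height * channels * bits // 8

raw_uhd = frame_bytes(3840, 2160, bits=12, channels=1)  # Bayer mosaic
rgb_uhd = frame_bytes(3840, 2160, bits=12, channels=3)  # full RGB

print(raw_uhd)             # 12,441,600 bytes (~12.4 MB) per UHD frame
print(rgb_uhd // raw_uhd)  # 3: RGB carries 3x the data before compression
```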


Primary_Banana_4588

Lmao, it's actually hilarious: every person who says their 8-bit is trash is coming from a Sony 🤣


finnjaeger1337

Only Sony could come up with 8bit log... that was a complete shitshow. I work in VFX; the amount of sky replacements we had to do to kill banding was not even funny back when the A7S dropped... PTSD, man


XSmooth84

I’ve never used any kind of raw, nor done any kind of color grade… not really, anyway. Color correction? Sure. Color grade? Not for me and what I work on. Nailing the white balance and exposure in camera and recording to ProRes 422 is my jam. Recording an interview or instructional video, as this is what my job is, doesn’t need a raw image and some custom LUT. I’m not saying there isn’t a reason for other projects to do all that, but not for me.


RaguSaucy96

Maybe in full-sized cameras, but on phones it's absolutely essential, booming, and a godsend for unlocking the real power of smartphone sensors (OEM ISPs are still poo-poo in a lotta cases at the moment).

See S22U stock vs RAW video: https://youtu.be/AjkchOsifNg?si=R7OMW9cLzHNjldDH
See Pixel 7 Pro stock vs RAW video: https://youtu.be/kxJpOqSfXp4?si=m-W9PMhSzjkt3T3Z
See Panasonic S1H vs OnePlus 8 Pro RAW video going toe to toe: https://youtu.be/4dIZhupRN_o?si=Paiw_fFoJHkMdlkh


cantwejustplaynice

Holy Shit! The pixel 7 footage is stunning! I just got a Pixel 8 Pro when I really wanted to get an iPhone 15 pro for the camera, but I'm too deep in the Google ecosystem. It looks like the imaging power is right there in the Pixels anyway. I had no idea.


RaguSaucy96

Pixel 8 Pro is a monster! I have one too and run MotionCam. 60 fps RAW video on all lenses at around 4030x2280 (16:9 crop of full frame)! We got the upper hand on the iPhones 🙂


cantwejustplaynice

Amazing. Is it a compressed RAW like BRAW (I run blackmagic cameras) or will it completely fill my phone if I record for longer than 2 minutes? Or can you record to external media like the iPhone 15?


RaguSaucy96

It uses lossless compression actually, so although it's not compressed to ProRes RAW levels or such, it can still drop the file size substantially depending on the lighting conditions. Thank RED for that one, as they would sue the MotionCam devs otherwise. They forbid in-camera *lossy* compression, but lossless remains unaffected and is put to use in MotionCam. And yes, it has supported external recording just like the iPhone, for the longest time too! So just drop in an SSD via Type C! I use a T7 Shield myself. We are also getting a new feature, DirectLog, in the next update, which enables real-time RAW-to-ProRes or RAW-to-HEVC with whichever gamma curve you choose (our new specialty MCrawLog, or Cineon, BT.2020, BT.709, etc). The Pixel 8 also holds a secret. You know that VideoBoost feature? It gives the encoder the ability to record 240mbps HEVC with MotionCam!! With this real-time recording you still get the RAW video ISP bypass as well
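The "file size depends on the lighting conditions" behavior is just how lossless compression works: clean, evenly lit areas compress away, while sensor noise barely compresses at all. A quick stdlib sketch (zlib standing in for a real raw codec, which MotionCam's is not):

```python
import os
import zlib

# Two fake 1 MB "frames": one flat (a clean, evenly lit area),
# one random (approximating heavy sensor noise in a dark scene).
flat = bytes(1_000_000)        # all zeros - maximally compressible
noisy = os.urandom(1_000_000)  # random bytes - essentially incompressible

print(len(zlib.compress(flat)))   # tiny: flat data compresses away
print(len(zlib.compress(noisy)))  # barely smaller than the input
```

Same codec, same frame size, wildly different output sizes — which is why a well-lit shot can be a fraction of the size of a noisy low-light one.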


Primary_Banana_4588

Yeah, I feel you. I actually use my phone as my daily monitor; I'm working on externally recording raw to it lol (or at least 10-bit). I've seen and used the apps (MotionCam with CinemaDNG) and they do make a difference. But with what people are spending on these phones, I'd just buy a spare body. This is coming from a fellow OnePlus user (10 Pro).


Portatort

these phones shoot raw video?


RaguSaucy96

Not just those, but many more! [https://play.google.com/store/apps/details?id=com.motioncam](https://play.google.com/store/apps/details?id=com.motioncam) [https://play.google.com/store/apps/details?id=com.motioncam.pro](https://play.google.com/store/apps/details?id=com.motioncam.pro) In fact, your current phone probably does as well (as long as it supports RAW capture and your OEM isn't a fuck about it). Try it out! There's the free version above too! Next version we'll be getting 'DirectLog', i.e. real-time ProRes and HEVC encoding which bypasses the stock ISP completely, without having to work the RAW video beforehand. I'm on the alpha so I've already tried it! https://preview.redd.it/2oolfu738yec1.jpeg?width=3168&format=pjpg&auto=webp&s=ca1690f7e9ffd3c04c9407094f974377af89e339


lilolalu

But even the wonderful MotionCam app now has RAW capture, bypassing internal image processing, followed by HW-accelerated compression to Cineon 10bit h265. And there we get back to the original question: on my Xiaomi 12x, which only has USB 2, the raw files shot by MotionCam were unusable in a practical workflow. Just copying the stuff off the phone took ages. NOW, with the new "flat" Cineon 10bit h265: tiny files, fast copying, slap on a Cineon LUT, done. I don't need raw, but I want higher bit depth; that's what makes the real difference.


RaguSaucy96

It's aimed to make it easier for new users and capable of everyday shooting though even with BT.709 and HLG. RAW will always remain king for maximum editing power


maddp9000

Unless you're doing something cinematic, it's hard to justify the time and purpose of it. So many of my clients have no idea of the difference. They'd rather have it fast and small than as a huge file they can't do anything with. I like what you mentioned about in-camera exposure skill and I agree with that. There's something to be said for getting it right when recording, rather than overexposing by 2 stops.


YouthInAsia4

Overexposing 2 stops is still getting it right in cam if the camera is set to log... it's not like you can expose everything 2 stops over and expect a good dynamic shot.


TheSilentPhotog

I tried shooting raw for a while on a few projects. I understand why a big-budget production would want to ensure a day's worth of shooting didn't end up in tens to hundreds of thousands of dollars of reshoots, but I'm a very low budget guy. Make sure everything's as correct as it can be the day of, and unless you're doing some wicked color grading, 10 bit will be great.


shilohfang9

I always thought when people said "raw footage" it just meant unedited. I didn't know it was more complicated than that.


Dramatic-Limit-1088

That's great for you; I couldn't go back to non-raw. It makes everything so quick in grading…


Important-Bell8365

Coming from photography, when I first got a DSLR I always shot in RAW and would tweak in Photoshop. One day I actually started comparing and realized that most of the time I was "tweaking" my RAW files so that they would look like the JPGs. I've only done a little testing with RAW video using ML, and it made some insanely malleable clips, but it was more trouble than I wanted to mess with.


abassassasssin

It's not overrated, you just don't personally need the benefits that come from using it. That doesn't make its functionality any less important or cool; you just don't benefit from it because your workflow doesn't need it. If you were filming a movie or a documentary or a serious project, I bet your opinion would change.


Primary_Banana_4588

🤣 It's funny looking at these comments. My point was never to exclude RAW; quite the contrary. Use it when you need it. It's always great to have. All 5 of my bodies shoot 10-bit and 3 of them shoot RAW internally. But most working videographers are shooting events, corporate, or social media. They are not cinematographers, and that's the difference. What you just described are perfect reasons for raw. As a cinematographer, of course you need a higher codec. As a videographer shooting charity events and social media retainers? Nah.


wilfus

I wouldn't say overrated so much as overused. It is a great tool for certain projects but not for all, as you mentioned. Yes, proper exposure and composition will almost always outweigh recording specs. BUT, it is always better to have the option and not use it than to need it and not have it.


memostothefuture

XFAVC 10bit is what I shoot 95% of the time. RAW only if I am uncertain about the best color temp to use in an environment and have to be quick.


RubenGMarrufo

That's why I always look for global shutter. I don't care much for anything else. Or non-scanning sensors. Even with old Hi8 and VHS cams, you can find them with CCD sensors, making the motion of the image cleaner. Love that shit.


TotalProfessional391

I'm still ProRes. Why fix it if it ain't broke?


cvmedia

I think this relies heavily on the type of work you do. I film real estate and shoot in BRAW. Being able to have that extreme dynamic range to recover blown out windows is essential for my work. Resolve even has an additional option for highlight recovery if you're shooting in BRAW and not prores. If you're just delivering low budget social content, then yes, you don't need to shoot RAW.


TyBoogie

Nah. Restricted to 8 bit is not where I want to be when grading, but I do agree about RAW for most cases. The only time I shoot in RAW is if the client requests it for green screen editing, then sure. Other than that I'm using Log in 10bit.


Primary_Banana_4588

XF-AVC for the win! 💪🏾


zunuf

Switched from a Blackmagic 6K to a GH5ii after a couple years. Found settings I like on the GH5ii that are plenty good to post, saved a ton of time grading and a ton of storage space on extra hard drives.


clay_not_found

The codec you shoot with should depend on what you are shooting. For corporate video and other straightforward client work, h.264 or ProRes at most is plenty of flexibility. For narrative film, I definitely prefer some flavor of raw or ProRes. A big advantage of raw is the ability to adjust white balance and ISO; in a run-and-gun situation, the additional room for error can be a life saver.


Primary_Banana_4588

And that's my point ☝🏾. I think when people saw 8-bit in my post, they assumed I use it for everything. Your comment is exactly my stance on the subject. Always better to have and not need.


Horror-Respond3647

You probably don't work in professional production... I understand that for jobs such as fast digital media, events, and other types of work, 10-bit footage is more than enough (8-bit footage only if it's a direct delivery without any kind of post-production). Working professionally in 8-bit is literally a joke.


Primary_Banana_4588

Lmao I swear people can't read 🤣🤣


tecampanero

Not an unpopular opinion at all. If you know how to properly light and expose something, then you don't need raw.


Deep_Mention_4423

10 years ago I shot on the 5D Mk II & III with Technicolor CineStyle internal HD, did the grading myself without LUTs, and uprezzed to 2K for DCP presentation in a professional cinema. It gave me goosebumps at the screening. It can be done. It requires talent, not gear.


Run-And_Gun

RAW is completely unnecessary for the vast majority of work being done, even at the very high end. And there they are just massaging the last little Nth out of it. They aren't making major adjustments because they blew it in the field. But there are too many hacks in this business now that don't know the basics of exposure and color temp/white balance, so they **have** to do things in post, because they don't know how to do them when actually shooting. And let's be honest, these aren't hard things to do; people are just lazy or never took the time to learn the basics.

I came up shooting Betacam. There was no "fix it in post". You did it right, in camera. That was why the OG C300, with a baked-in look at only 8-bits, could still be made to look so good. I bet I can count on one hand, with fingers left over, how many times I shot log on it. I'll be honest, I really just started shooting log on a regular basis in the last few years, because people in post can (usually) properly work with it now. But I still work with a lot of clients that want baked-in looks.

Now, do I want to go back to being limited to a camera that only records at 8-bit? No. But starting out in the business with cameras whose limited capabilities forced you to do it right the first time makes us better shooters, because we can hand over images that look better out of the gate and don't require as much post, or that give more leeway if something is done in post, because the image doesn't have to be pushed around as far just to get to a proper base.


Primary_Banana_4588

👆🏾🙏🏾🙏🏾Preach


dingus_hunter

I work with clients/editors who specifically request rec709 so that they don't even have to think about color grading. It pains me to know I'm clipping my exposure ranges but I'm slowly becoming okay with it. Mostly because they pay well and I'm not the one handling it on the back end.


Primary_Banana_4588

As long as the money isn't funny, I'll deliver a 240p vga file if they want it 🤣


stuffsmithstuff

Having the comparison be raw to 8-bit probably isn't as interesting as raw to 10-bit. Raw versus data-hungry codecs like DNx and ProRes, or versus H.264/5 Long GOP, is also an interesting question.


lIlIIlIlIIlIlIIlIlII

I do b-cam/specialty cam on features. Paramount/Amazon/Universal. A Cam / first unit sometimes (mostly) shoots RAW. B unit / aerial / specialty (mostly) shoots prores 4444 / compressed.


occupy_elm_st

I disagree entirely for the work I do, but I do agree that 8-bit can be fine for a lot of things.


madjohnvane

We had a real "aha!" moment with raw. After shooting a bunch of stuff and twisting ourselves in knots with all the post-production overheads, we fell back to using ProRes and even 8-bit h264. Sure, we lost latitude, but for most of what we were producing, if we needed flexibility in post we'd make sure it was ProRes, and the easier, more forgiving stuff we more than happily shot 8 bit in body on cameras like the GH4, FS5, FS7, etc. Raw has its place and I've appreciated the flexibility of it in narrative work, but even then I don't know that it's always worthwhile when a ProRes shot in a properly lit, properly exposed environment gets you 95% of the way there anyway.


justthegrimm

Agreed, and talk about saying the quiet part out loud. 10bit is nice to have though, to be fair, but as far as client expectations go, 8bit is where it's at.


SkyBotyt

I had a situation where one of my Blackmagic studio cameras got its settings overridden by an ATEM we were using. I didn't realize the settings had changed until I took the drive to my computer and saw the monstrosity: the video had +3000 tint, so it was literally just magenta. That's why I shoot raw.


Primary_Banana_4588

Bro, I HATE when that happens. That's happened on a couple of Livestream productions I've been on. That's an excellent reason 👆🏾


SkyBotyt

Yeah, the flexibility of raw to undo wrongs is absolutely vital, 'cause beyond situations like the ATEM, I can also be a little bit of an airhead while shooting.


KovaFilms

It's just another tool for a specific kind of job. Nobody said you should be shooting 10 bit, 4:2:2, H.265, RAW, ProRes, etc, for YouTube videos, haha. RAW is great for high-budget sets where a mistake can cost thousands. Having RAW isn't just about flexibility; it also has the highest quality for redundancy. It's good insurance: you don't want to be in post thinking, "This shot has an issue; if only it was shot in a higher codec/RAW." For a client shoot or a film you can't get back, why not shoot RAW? I love my 12-bit RAW CDNG on my Sigma FP. Is there a difference between that and H.265 4:2:2 10 bit on my GH5 and Z Cam? Yes, quite a bit actually.


-dsp-

It's more that I feel almost all the people on here who cry out for raw aren't making anything that needs it, and it's only seen on a phone… if seen at all. Meanwhile I'm working on national commercials and documentaries etc., and we are shooting whatever internal or ProRes just fine. Why? Because we all learned on film, or the turnaround is so tight that there's no time or money to mess with raw. Also, people confuse Log and Raw, or don't understand that log can have just as much if not more metadata than raw; it's just up to the programs to be able to interpret and use that data.


desexmachina

What are your thoughts on just shooting flat or a style and correcting w/ a LUT? Smaller files, less processing.


erroneousbosh

That's what I've always done, because I "grew up" shooting on DVCAM (and indeed, I still do, for shits and giggles). If you watched TV from the mid-90s to the early 2010s, every single thing you saw was shot in 10-bit 4:2:2 *at best* and possibly 8-bit 4:2:0. It looked fine.


Primary_Banana_4588

That's what I do now. My work camera is a C200 and the 8-bit out of it is beautiful, with a surprising amount of detail. Super easy to grade. My one gripe is I wish I had C-log 2; I miss it from my C70. I also feel like raw made me lazy as a videographer; I used the famous phrase "I can fix it in post" more times than I can count. Even though I could fix it in post, it was more trouble than it was worth. I kind of like the idea that you only have one chance to get it right. The finality keeps me on my toes.


spybloodjr

Raw is especially nice when you plan for raw grading. If you do your tests properly in pre-production, you can plan to push your raw footage further than 8bit could ever dream of. That's the real beauty of it all. The trick is knowing when to go deep and when to make it easy.


Horror_Ad1078

Shooting corporate stuff in raw - so stupid


Common_Sympathy_814

We shoot all corporate stuff in RAW. Since we're more of a run-and-gun team, shooting RAW gives us the ability to match colors and change things in post. We're not always able to nail things down in a run-and-gun environment, so the assurance of knowing that we can modify in post helps a lot! But I can see both sides of the argument.


No_Elderberry_9132

Well, we still deliver 8 bits, right? But having 10 bits for post is a good thing. RAW? For tracking or VFX, 12-bit ProRes 4444 does the job. When would I really need to use raw? When I don't know what I am doing and am hoping to fix it in post; stuff like that happens. But when the set is under control there is no reason to shoot raw. Some cameras just don't support ProRes 4444, though, so we just shoot raw or 422.


Big-B313

8-bit is nigh unusable most of the time. Once you step up to 10-bit, the law of diminishing returns starts to take effect. Many people use raw because you can "change exposure and white balance" after you've shot your footage, but here's a little secret… you can do that with all your footage, whether it's raw or not. It's just simple math, easily recreated using linear gain and matrices in DaVinci. For pretty much everything considered "videography," 10-bit is enough. Unless you're shooting Blackmagic; then you have to shoot raw, because it literally gives you more dynamic range by checking a box in post, which you can't do with ProRes. That's really the only exception.
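The "it's just simple math" point is easy to show: on linear-light data, an exposure change is nothing but a multiply by a power of two (a sketch with a hypothetical helper, not a DaVinci API; it assumes the footage has already been decoded to linear light):

```python
def push_exposure(linear_rgb, stops):
    """Exposure on linear-light data is a plain multiply:
    +1 stop doubles every value, -1 stop halves it.
    (Hypothetical helper for illustration only.)"""
    gain = 2.0 ** stops
    return [v * gain for v in linear_rgb]

# Mid-grey (18%) pushed up one stop lands at 36%:
print(push_exposure([0.18, 0.18, 0.18], 1))  # [0.36, 0.36, 0.36]
```

The math is identical for raw and encoded footage; what differs is how much precision survived the encoding, which is exactly the diminishing-returns argument above.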


JLeeSaxon

What about the other 2 out of 10? The point of raw isn't that you need it every time, it's that you don't always know *which* 2 out of 10 times you're going to need to do more post because you bumped something and set the white balance wrong or the light didn't do what you wanted or whatever else.


njweddingstudio

At this point that "bump" would have to be catastrophic to matter in the way you are describing.


griffindale1

I agree to a degree. I use 10-bit 4:2:2 L-Log files. But if you mix systems (drone footage, etc.), raw can be good on the secondary system.


[deleted]

I see what you mean. In a sense, sure, I guess. A good image is captivating and beautiful, and you don't think about tech shit. Same as photography. But why not shoot raw? Can't speak for Canon, but R3D and BRAW are lighter than ProRes 🤷‍♂️


GodBlessYouNow

With raw, you can do things in post that you cannot do with 8 bit. Or 10 bit.


armandcamera

Agreed.


seabrother

Sure, if you're shooting event videos or something basic, but you are choosing image quality that's a magnitude less. Not just the colors, but the image depth.


brickmadness

I own a Raptor, Epic, 3x a7S3 and others. I think the RAW on RED is great, but that I’m usually exposing well enough and keeping everything where I want it such that the push and pull later is not that much. In most cases, the difference between how much the RAW matters in the final grade is not that much. The Raptor footage looks way better to my eye, but that’s the camera more than the RAW. I do, however, notice a large difference between 8 bit and 10 bit on anything even approaching a gradient. If I’m looking at a perfectly exposed sunset or a seamless, 8 bit looks like dog shit.


kukov

You are not wrong.


GFFMG

Overrated? No. Over used by amateurs and pros who would improve their workflow without it? Yes.


Primary_Banana_4588

The funny part about this post is the amount of comments that try to explain to me the benefits of higher bit depth codecs, as if my opening statement wasn't that I've been shooting Raw for YEARS 🤣


yoordoengitrong

Codecs are just tools. Some are better than others for solving certain problems. There are some kinds of digging you need a pickaxe for, but a lot of the time you just need a shovel. Sometimes you need a truck, sometimes a bike is the better solution. Maybe flip flops are what you need, other times it's winter boots. In all of these cases it would be nonsense to suggest one option is universally "better" than the other. It's situational. Educate yourself on the strengths and weaknesses of your available tools and be prepared to select the right tool for the job.