5 THINGS: Simplifying Film, TV, and Media Technology


Podcast by Michael Kammes

Michael Kammes answers 5 things about technologies and workflows in the media creation space. Production, post production, reviews, gear, tech, and some pop culture snark.


All episodes

53 episodes
Editing Remotely with Avid Media Composer

All of you are asking the same thing: “How can I edit remotely or work from home?” Today we’ll look at Avid [http://avid.com], as they have many supported options, so you can cut with Media Composer [https://www.avid.com/media-composer] from just about anywhere. Let’s get started.

1. EXTENDING YOUR DESKTOP

The first method we’ll look at is simply extending your desktop – that is, having your processing computer at the office while you work from home and remote into that machine. This has been the crutch that most facilities have relied on in the past few weeks. Let’s examine how this works.

First, this scenario assumes that you edit at a facility, where all of the networked computers and shared storage are…and that you can’t take any of those things home. This can be due to security, or other concerns like needing access to hundreds of TB of data. In this case, the creatives are sent home, and I.T. installs a remote desktop hardware or software solution on each of the machines. The creatives then connect through a VPN, or virtual private network [https://en.wikipedia.org/wiki/Virtual_private_network], to gain secure access from their home editing fortresses of solitude back into the facility and attempt to work as normal.

Diagram: Extending your desktop [https://5thingsseries.com/wp-content/uploads/2020/04/S03E06_5THINGS_Remote-and-Work-From-Home-Editing-with-Avid-Media-Composer-1-Extending-your-desktop.gif]

On the surface, this sounds like a real win-win, right? You get access to your usual machine and your usual shared storage. Sure, you lose things like a confidence monitor (if you had one), but you should be fine, right? The devil, as always, is in the details. Typical screen sharing solutions installed on your office editing machine are often dumpster fires for creatives. I’m not saying they are bad for general I.T. use, or when you need to remote in and re-export something, but by and large most screen sharing protocols do not give a great user experience. Full frame rate, A/V sync, color fidelity, and responsiveness usually suffer. Solutions like TeamViewer [https://www.teamviewer.com/], Apple [https://support.apple.com/remote-desktop] or Microsoft Remote Desktop [https://www.microsoft.com/en-us/p/microsoft-remote-desktop/9wzdncrfj3ps?activetab=pivot:overviewtab], VNC [https://en.wikipedia.org/wiki/Virtual_Network_Computing], or most any of the other web-based solutions fail. Hard. You’ll pull all of your hair out before you finish an edit.

Moving up to more robust solutions like HP’s RGS – Remote Graphics Software [https://support.hp.com/us-en/document/c02163749] – or a top of the line solution like Teradici’s PCoIP software [https://www.teradici.com/what-is-pcoip] is about as good as you’re gonna get. The license may cost a few hundred dollars, too…depending on your configuration. But here’s the kicker: they’re Windows only as a host. While you can access the computer running the Teradici software from a macOS or Windows equipped computer – or even via a hardware zero client – the environment you create in will always be a Windows OS. Quite unfortunately, there is no post-production-creative-friendly screen sharing solution that hosts on macOS.

The only solution I’ve come across over these many years is a company called Amulet Hotkey [https://www.amulethotkey.com/] – yes, that’s their name – who take the Teradici PCoIP Tera2 card, put it into a custom external enclosure [https://www.amulethotkey.com/products/kvm-extender-and-remote-workstation-cards/kvm-extender-hosts-pcie-cards/#tech3], and add some secret sauce. You then feed the output of your graphics card, plus your keyboard, mouse, and audio into the device, and the PCoIP technology takes over. It’s quite frankly the best of both worlds: PCoIP and macOS.

Amulet Hotkey DXT-H4 (rear) [https://5thingsseries.com/wp-content/uploads/2020/04/DXT-H4_QSG_QS-THA4-1110-rear.jpg]

This ain’t gonna be cheap. Expect a few thousand dollars per hardware device, and availability at the moment may be difficult. You’re also going to need to do some network configuration for Quality of Service [https://www.teradici.com/web-help/TER1105004/Sess-Plan_Admn-Gd.pdf], and then decide how you’re going to “receive” the screen share at home, either on a laptop/desktop or with a zero client [https://www.teradici.com/products/desktop-performance-solutions/zero-clients]. Whichever you choose, the responsiveness of your home connection back to the facility is the real gatekeeper; a quick check like the sketch below can save a lot of frustration.
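Here is a minimal, hedged sketch of that kind of check: it times a handful of TCP connections from a home machine back to a facility host to get a rough feel for round-trip latency before you commit to a remote desktop rollout. The hostname and port are hypothetical placeholders, and a TCP connect is only a stand-in for what PCoIP or RGS actually does on the wire.

```python
import socket
import statistics
import time

FACILITY_HOST = "vpn.myfacility.example"  # hypothetical: your facility's VPN gateway or a host behind it
FACILITY_PORT = 443                       # any TCP port that is reachable from home

def tcp_round_trip_ms(host: str, port: int, timeout: float = 2.0) -> float:
    """Time a single TCP connect/close as a rough stand-in for round-trip latency."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0

samples = [tcp_round_trip_ms(FACILITY_HOST, FACILITY_PORT) for _ in range(10)]
print(f"median: {statistics.median(samples):.1f} ms, worst: {max(samples):.1f} ms")
```

If the median is already pushing toward triple digits before any pixels are being pushed, no screen sharing protocol is going to make the session feel like sitting in the bay.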
2. ISLANDS OF MEDIA

There is no doubt that as a Media Composer user, you’ve already tried this. It’s the easiest and least expensive way to have multiple users working on Avid projects at the same time while not in the same building. Wait a second, let’s back up and review how this works before we jump into the nitty-gritty.

We start off as we did before, with everyone working at the facility on networked computers and shared storage. We then begin to replicate the media from the facility onto portable drives.

Diagram: Islands of media [https://5thingsseries.com/wp-content/uploads/2020/04/S03E06_5THINGS_Editing-Remotely-with-Avid-Media-Composer-2-islands-of-media.gif]

Obviously, this could be a security risk and could potentially break the contract your facility might have with the content providers. But that’s a soapbox I’ll get on later in the episode.

Replicating the media will require some serious media management. This means an accounting system to track who has what media, as well as a tight versioning schema. You’ll need to look at syncing schemes to get new media out to users, unless the editor is coming back to the facility for updated content. This may also include someone at the facility creating lower resolution proxies, so editors can go home with a few TBs of media instead of dozens of TBs, and often watermarks on the media as yet another level of accountability in the event of a leak. Once that media is at home with the editor, there has to be a standardization on folder organization, naming conventions, and an extreme amount of attention paid to media management. That accounting doesn’t have to be fancy; see the sketch below for one way to start.
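A minimal sketch of that kind of accounting, assuming one manifest per portable drive (the drive path and manifest name are hypothetical). It hashes every file so the facility can later verify exactly what an editor has and whether it matches the current version:

```python
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

DRIVE_ROOT = Path("/Volumes/EDITOR_01")    # hypothetical portable drive handed to an editor
MANIFEST = Path("manifest_editor_01.csv")  # one manifest per drive

def sha1_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MB chunks so large media files never have to fit in RAM."""
    digest = hashlib.sha1()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

with MANIFEST.open("w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["relative_path", "bytes", "sha1", "logged_at_utc"])
    for media_file in sorted(DRIVE_ROOT.rglob("*")):
        if media_file.is_file():
            writer.writerow([
                media_file.relative_to(DRIVE_ROOT),
                media_file.stat().st_size,
                sha1_of(media_file),
                datetime.now(timezone.utc).isoformat(timespec="seconds"),
            ])
```

Comparing the facility’s manifest against an editor’s then becomes a simple diff, which covers a surprising amount of the “who has what, and which version” problem.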
Avid has always had a fantastic approach to project management. Projects link to any number of bins, those bins contain sequences, and both point to media. This means bins are small files that can be emailed or dropboxed or otherwise shared with other users quite easily. Provided each user has the appropriate media, Media Composer can be coaxed into relinking to the media when a new bin is loaded. This workflow does present some gotchas.

Any rendered files will most likely need to be re-rendered on each machine, and multiple users can’t work in the same bin at the same time, since you don’t get the usual red lock/green unlock ability, so there has to be some communication so as not to mess up someone else’s work.

Avid Bin Locking – Red Lock & Green Unlock [https://5thingsseries.com/wp-content/uploads/2020/04/3.Avid_bin_locking_and_unlocking.png]

Obviously you’ll need a computer with a licensed copy of Media Composer, plus the plugins you may need. Maybe you’ll have an awesome company like the reality post facility Bunim Murray, who sent the editors home with their edit bay computers. Or maybe you’ll have to grit your teeth with that old laptop collecting dust in the garage. Or maybe – just maybe – you have the time and budget for the next few solutions.

3. VIRTUALIZATION AND EXTENDING: MEDIA COMPOSER | CLOUD VM

This next solution is new-ish to the Avid family and has traditionally only been something you did within the 4 walls of your facility, out to your edit bays – not to your home: Avid Media Composer | Cloud VM [https://www.avid.com/media-composer/cloud-vm-vs-cloud-remote]. BTW, trivia tidbit: the vertical line you see in many Avid naming conventions? It’s called a Pipe. Make of that what you will.

Media Composer | Cloud VM involves investing in a stack of servers at your facility and running VMs, or virtual machines, on those servers. On these VMs, a specially licensed copy of Media Composer Ultimate runs. Typically this is done so everyone within the facility can access their computers, Media Composer, and shared storage from anywhere in the building, and I.T. can administer everything from one location. No need to have computers in each of the bays. It’s sort of like the old tape room methodology.

Diagram: Virtualizing and extending with Media Composer | Cloud VM [https://5thingsseries.com/wp-content/uploads/2020/04/S03E06_5THINGS_Editing-Remotely-with-Avid-Media-Composer-3-Virtualizing-and-extending-Media-Composer-Cloud-VM.gif]

As everyone is in the same facility, latency is cut down, and using tools like Teradici’s PCoIP software gives the user a very fluid creative experience. Recently, some facilities have been asking the question: “if our users are simply remoting into the VMs while they’re here in the office, why can’t they do the same at home?” As you can imagine, quality of service and the user experience are paramount in the Avid world, so this was usually discouraged. In fact, as of this video, Avid still only recommends this for users within a facility. But that hasn’t stopped folks from trying it and using it. So, users go home, and on their laptops or desktops they load up their Teradici PCoIP software client, connect via a VPN back to the facility mothership, and continue working – much like we covered in option #1: Extending your desktop.

There are a few caveats, however. This is not a “light ’em up tomorrow” solution. It requires specific servers, specific switches, and specific builds. Expect tens of thousands of dollars.
It also requires upgraded licenses…and not just for Media Composer: many 3rd party software solutions and plugins either don’t handle virtualization well or want to charge you for the privilege. And playing with 3rd party storage may not be pleasant. And since you use the Teradici PCoIP handshake to access the VMs, the virtualized environment is *only* Windows. Ahh, ahh – stop it. There’s no crying in tech.

Teradici PCoIP, while a fantastic protocol, does have some limitations when it comes to creatives. Higher-end color grading may be difficult, as PCoIP is limited to 8-bit viewing, although the newer Ultra variant does have 10-bit capability. Audio is usually limited to stereo playback, and don’t expect a confidence monitor output – it’s just the computer GUI. But for most editing purposes, it’ll work just fine.

4. STREAMING PROXIES: MEDIA COMPOSER | CLOUD REMOTE

This solution has actually been around for many years, but it was just called something different and was mainly found within news-type deployments. The premise is pretty elegant. If Avid MediaCentral UX [https://www.avid.com/products/mediacentral-ux] – formerly branded as Interplay – was already your asset management du jour, managing and tracking your media at your facility, why couldn’t it serve up that media to you on demand wherever you were?

Diagram: Media Composer | Cloud Remote with MediaCentral [https://5thingsseries.com/wp-content/uploads/2020/04/MC-Cloud-Remote.jpg]

And thus, Avid devised what is now known as Media Composer | Cloud Remote [https://www.avid.com/products/media-composer-cloud-remote]. Media Composer | Cloud Remote, when coupled with MediaCentral | Production Management [https://www.avid.com/products/mediacentral/mediacentral-production-management], allows servers at your facility to serve up real-time proxies of your on-premises media out to your remote machines.

Diagram: Streaming proxies with Media Composer | Cloud Remote [https://5thingsseries.com/wp-content/uploads/2020/04/S03E06_5THINGS_Editing-Remotely-with-Avid-Media-Composer-4-Streaming-Proxies-Media-Composer-Cloud-Remote.gif]

The end user has a full version of Media Composer and connects over a VPN back to the facility. The user checks out a project from the server, which is linked to media on Avid NEXIS shared storage back at the facility. When the local version of Media Composer attempts to access media, the local software retrieves that media from the server at the facility and streams a low-res version in real time to your local machine.

Screenshot: Media Composer | Cloud Remote proxy settings [https://5thingsseries.com/wp-content/uploads/2020/04/MediaComposerCloudRemoteReadMe_2018_6.jpg]

You are literally playing media in your timeline that’s being served up from hundreds or thousands of miles away – all within the familiar environment of Avid. Here’s the best part: your local copy of Media Composer can run on either Windows *or* macOS.
Media Composer | Cloud Remote also has the added bonus of allowing remote users to work with their local media and then upload that local media in the background back to the facility, checking it into the shared storage so others can access it. It’s a pretty slick solution.

However, Cloud Remote is simply a software option on top of Avid’s MediaCentral | Production Management solution. Meaning: remote editing with streamed proxies is not the main selling point of the system. It’s the asset management, automation, and collaborative features that drive companies to invest in it, with remote editing as an add-on. It’s like buying a house because it has a really cool garage. It’s also not cheap, nor easy to administer. You’re going to need an Avid ACSR to handle it, plus stacks of servers and storage. Expect over $100K to get going. Once configured, however, it’s elegant tech and very cool.

5. COMPUTER, STORAGE, AND MEDIA IN THE CLOUD: AVID | EDIT ON DEMAND

Edit on Demand [https://www.avid.com/products/avid-edit-on-demand] was a relatively quiet beta solution by Avid, and it’s only really been viable for the past year. It’s available in early access and “by request only”. In essence, Avid | Edit on Demand is Media Composer software running in a public CSP, or cloud service provider – in this case, Microsoft Azure [https://www.avid.com/microsoft]. The application, the NEXIS storage, and everything you need to edit with is in the cloud, and you access it via a laptop, desktop, or a zero client. It’s very similar to the method I talked about earlier, Virtualizing and Extending with Media Composer | Cloud VM; the difference here is that everything is in the cloud, not at your private facility.

Let’s take a look. We’ll start where we’ve started every workflow: the traditional edit-at-a-facility setup. We then take that infrastructure and virtualize it within the Microsoft Azure cloud – and only on a Windows desktop. Editors get to use Media Composer Ultimate and they get access to NEXIS storage, so bin locking and project sharing work as you would expect. It’s then business as usual.

Diagram: Avid | Edit on Demand [https://5thingsseries.com/wp-content/uploads/2020/04/S03E06_5THINGS_Editing-Remotely-with-Avid-Media-Composer-5-Avid-Edit-on-Demand.gif]

Like the other virtualized scenarios, this solution is also built on Teradici’s PCoIP, so the user experience is fantastic. Avid has also partnered with FileCatalyst to enable users to upload and download content to and from the cloud NEXIS. This type of solution allows productions to scale up and down quickly as needed and plays right into OpEx business models…which the film and TV industry is famous for. At last check, Avid was able to get these workstations and the environment ready in under a week. And because it’s in the cloud, it allows users to work all around the world.

The usual caveats apply: there is no video monitor output currently, and audio is still limited to stereo. And no, Pro Tools is not supported in any cloud. It’s also not easy to get other applications installed on the workstation, or to get direct administrator access to the NEXIS storage. Pricing starts around $3,000 per user per month, assuming under 200 hours of workstation time a month. You can also buy hours and users in bundles, as well as TBs of storage. And while this price is a little steep for freelancers, it certainly is the fastest way to scale quickly and without a large CapEx investment.
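To put that pricing in context, here’s a tiny back-of-the-envelope sketch. The $3,000/month and 200-hour figures come from the discussion above; the team size, project length, and the CapEx comparison figure are purely hypothetical placeholders, not Avid’s rate card:

```python
# Figures from the episode, plus hypothetical placeholders for comparison.
monthly_per_user = 3000        # USD per editor per month (Avid | Edit on Demand, starting price)
included_hours = 200           # workstation hours assumed per month
editors = 6                    # hypothetical team size
project_months = 4             # hypothetical show length

cloud_opex = monthly_per_user * editors * project_months
effective_hourly = monthly_per_user / included_hours

capex_per_suite = 15000        # hypothetical: workstation, storage share, licenses, bay build-out

print(f"Cloud OpEx for the run of show: ${cloud_opex:,}")
print(f"Effective workstation rate:     ${effective_hourly:.0f}/hour")
print(f"Hypothetical CapEx for {editors} suites: ${capex_per_suite * editors:,}")
```

The OpEx model tends to win when the show is short and the team scales up and down; it loses its shine when the suites would be reused for years.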
BONUS EDITORIAL

While available technology certainly frames what can and cannot physically be done, we’re still hampered by stagnant security restrictions and guidelines, and outdated work environment expectations. Many of the security requirements that are jammed into contracts are outdated and have not been revised, namely because no one wants to question them and potentially lose a large client. The looming worry of a hack, which more often than not is simply a password that was socially engineered, creates paranoia and the stacking of security protocols on top of more protocols. Mark my words, you’re gonna see a serious revamp of security recommendations after this is over, and quite frankly, before this is over. I’m also hopeful that once these security guidelines are revisited and revamped, facilities begin to realize that if you hired someone to represent your company, your brand, and generate the work that gets you paid, you’d trust them to work remotely in some capacity.

Have more Avid remote editing concerns beyond just these 5 questions? Ask me in the Comments section. Also, please share the tech goodness of this *entire* series with the rest of your tech friends…they’ve got nothing better to do right now.

Until the next episode: learn more, do more. Like early, share often, and don’t forget to subscribe.

The post Editing Remotely with Avid Media Composer [https://5thingsseries.com/episode/editing-remotely-with-avid-media-composer/] appeared first on 5 THINGS - Simplifying Film, TV, and Media Technology [https://5thingsseries.com].

19. apr. 2020 - 16 min
An Intro to Using the Cloud for Post Production

On this episode of 5 THINGS, we’re gonna get hiiiiiigh! In the clouds, with a primer on using the cloud for all things post-production. This is going to be a monster episode, so we’d better get started.

1. WHY USE THE CLOUD FOR POST-PRODUCTION?

We in the Hollywood post industry are risk-averse. Yes, it’s true, my fam: look in the mirror, take a good hard look, and realize this truism. Take the hit. This is mainly because folks who make a living in post-production rely on predictable timetables and airtight outcomes. Deviating from this causes a potentially missed delivery or airdate, additional costs on an already tight budget, and quite frankly more stress. The cloud is still new-ish, and virtually all post tasks can be accomplished on-premises. So why on earth should we adopt something that we can’t see, let alone touch?

Incorporating the cloud into your workflow gives us a ton of advantages. For one, we’re not limited to the 1 or 2 computers available to us locally. This gives us what I like to call parallel creation, where we can multitask across multiple computers simultaneously. Powerful computers. I’m talking exaFLOPS, zettaFLOPS, and someday yottaFLOPS of processing power…more flopping power [https://en.wikipedia.org/wiki/FLOPS] than that overclocked frankenputer in your closet. Yeah, I said it. Flopping power. It’s also mostly affordable and getting cheaper quickly.

To be clear, I’m not telling you post-production is to be done only on-premises or only in the cloud; most workflows will always incorporate both. That being said, the cloud isn’t for everyone. If you have more time than money, then relying on your aging local machines may be the best economical choice. If your internet connection is more 1999 than 2019, then the time spent uploading and downloading media may be prohibitive. This is one reason I’m jazzed about 5G [https://en.wikipedia.org/wiki/5G]…but that’s another episode. Now, let’s look at some scenarios where the cloud may benefit your post-production process.

2. TRANSFER AND STORAGE

Alright, let’s start small. I guarantee all of you have used some form of cloud transfer service and are storing at least something in the cloud. This can take the form of file sharing and sync applications like Dropbox [https://www.dropbox.com/], transfer sites like WeTransfer [https://wetransfer.com/], enterprise solutions like Aspera [https://asperasoft.com/], Signiant [https://www.signiant.com/], or FileCatalyst [https://filecatalyst.com/], or even that antiquated, nearly 50-year-old protocol known as FTP [https://tools.ietf.org/html/rfc114]. Short of sending your footage via snail mail or handcuffing it to someone while they hop on a plane, using the internet to store and transfer data is a common solution. The cloud offers numerous benefits.

Chart: Hard drive life expectancy – roughly 20% dead after 4 years [https://5thingsseries.com/wp-content/uploads/2014/09/S02E05_archive_1-harddrive-life-span.png]

First is what we call the “five nines”, or 99.999% availability. This means that the storage in the cloud is always available and with no errors, with a max downtime of about 5 ½ minutes a year. In the cloud, five nines are often considered the bare minimum; companies like Backblaze claim eleven nines of durability [https://www.backblaze.com/blog/cloud-storage-durability/].
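Those “nines” convert directly into an allowed-downtime budget, which is easy to sanity-check yourself. A quick sketch (the math applies to any availability figure; durability nines are a separate guarantee about data loss rather than uptime):

```python
SECONDS_PER_YEAR = 365 * 24 * 60 * 60

def max_downtime_per_year(availability_percent: float) -> str:
    """Convert an availability percentage (e.g. 99.999) into worst-case downtime per year."""
    down_seconds = SECONDS_PER_YEAR * (1 - availability_percent / 100)
    minutes, seconds = divmod(down_seconds, 60)
    return f"{int(minutes)} min {seconds:.1f} s"

for label, pct in [("three nines", 99.9), ("five nines", 99.999), ("eleven nines", 99.999999999)]:
    print(f"{label} ({pct}%): at most {max_downtime_per_year(pct)} per year")
```

Five nines works out to roughly 5 minutes 15 seconds a year, which is where the “about 5 ½ minutes” figure above comes from.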
This is considerably more robust than, say, that aging spinning disk you have sitting on your shelf. In fact, almost a quarter of all spinning hard drives fail in their first 4 years. Ouch.

I completely get that subscription or “rental” models are a highly divisive subject, and at the end of the day, that’s what the cloud storage model is. But you can’t deny that a cost you get to spread out over years (also known as OpEx, or operating expenditure budget [https://en.wikipedia.org/wiki/Operating_expense]) is a bit more flexible than the one-time buyout of storage (known as CapEx, or capital expenditure budget [https://en.wikipedia.org/wiki/Capital_expenditure]). A rough comparison is sketched at the end of this section.

Which brings us to the next point: what are the differences between the various cloud storage options? Well, that deserves its own 5 THINGS episode, but the 2 main points to know are that the pricing model covers “availability”, or how quickly you can access the storage and read and write from it, and throughput, or how fast you can upload and download to it. Slower storage is cheaper, and normal internet upload and download speeds are in line with what that storage can provide. Fast storage, that is, storage that gives you Gigabits per second for cloud editing with high IOPS, can be several hundred dollars a month per usable TB. This is why cloud storage is often used as a transfer medium, or as a backup or archive solution, rather than a real-time editing platform. However, with the move to more cloud-based applications, faster storage will become necessary. With private clouds and data lakes popping up all over, the cost of cloud storage will continue to drop, much like hard drive cost per TB has dropped over the past several years.

Chart: Hard drive and cloud storage costs compared [https://5thingsseries.com/wp-content/uploads/2019/11/S03E05_5THINGS_hard-drive-and-cloud-storage-costs-compared.png]

Cloud storage also has the added benefit of allowing work outside of your office and collaborating in real time without having to be within the 4 walls of your company. Often, high-end firewalls and security are, well, highly priced, and your company may not have that infrastructure…or the I.T. talent to take on such an endeavor. Rely on the cloud, and that security is built into your monthly price. Plus, most security breaches or hacks are due to human error or social engineering, not a fault in the security itself. Cloud storage also abstracts the physical location of your stored content from your business, making unauthorized access and physical attacks that much harder.
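Here is that rough CapEx-versus-OpEx comparison. Every number below is a hypothetical placeholder for illustration; plug in current drive prices and the quote from your storage tier of choice:

```python
# Hypothetical figures for illustration only; substitute real quotes.
capacity_tb = 40
drive_cost_per_tb = 25           # USD/TB, one-time purchase of bare spinning disks
cloud_cost_per_tb_month = 5      # USD/TB/month for a slower "archive/backup" tier
years = 4                        # roughly when spinning-disk failure rates ramp up

capex = capacity_tb * drive_cost_per_tb
opex = capacity_tb * cloud_cost_per_tb_month * 12 * years

print(f"Buy-once drives: ${capex:,} up front (plus replacements, power, and your time)")
print(f"Cloud storage:   ${opex:,} spread over {years} years (durability and off-site access included)")
```

The raw dollars usually favor owning drives; the argument for the cloud is everything that second parenthetical glosses over.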
3. RENDERING AND TRANSCODING AND VFX

The next logical step in utilizing cloud resources is to offload the heavy lifting of your project that requires flopping power. The smart folks working in animation and VFX have been doing this for years. Rendering 100,000 frames (about an hour’s worth of material, depending on your frame rate) across hundreds or thousands of processors is gonna finish much faster than across the handful of processors you have locally. It’s also a helluva lot cheaper to spin up machines as needed in the cloud than to buy the horsepower outright for your suite. Before you begin, you need to determine what you’re creating your models in and whether cloud rendering is even an option. Typical creative environments that support cloud rendering workflows include tools like 3DS Max, Maya, and Houdini, among others.

Next is identifying the CSP – cloud service provider – that supports a render farm in the cloud; in this case, one of the big 3: Microsoft Azure, Amazon Web Services, or Google Cloud.

Diagram: Connecting to a CSP via VPN [https://5thingsseries.com/wp-content/uploads/2019/11/S03E05_5THINGS_VPN-1.png]

Once you have your CSP selected, a user establishes a secure connection to that CSP, usually via a VPN, or virtual private network [https://en.wikipedia.org/wiki/Virtual_private_network]. A VPN adds an encrypted layer of security between your machine and the CSP, and provides a direct pipe to send and receive data between your local machines and your CSP.

From here, queuing and render management software is needed. This is what schedules the renders across multiple machines and ensures each machine is getting the data it needs to crunch in the most efficient way possible, plus it recombines the rendered chunks back together. Deadline [https://www.awsthinkbox.com/deadline] and Tractor [https://renderman.pixar.com/tractor] are popular options. This software also orchestrates media movement between on-premises storage, the staging area before the render, and wherever the rendered media ends up. Next, the render farm machines run specialized software to render your chosen sequence. This can be V-Ray [https://www.chaosgroup.com/], Arnold [https://www.arnoldrenderer.com/], or RenderMan [https://renderman.pixar.com/], among many others. Once these frames are rendered and added back to the collective sequence, the file is delivered.

I know, this can get daunting, which is why productions traditionally have a VFX or Animation Pipeline Developer. They devise and optimize the workflows so costs are kept down but the deadlines are hit. This hybrid methodology obviously blends creation and artistry on-premises with the heavy lifting done in the cloud. However, there is a more all-in-one solution, and that’s doing *everything* in the cloud. The VFX artist works on a virtual machine in the cloud, which has all of the flopping power immediately available. The application and media are directly connected to the virtual machine. Companies like BeBop Technology [http://beboptechnology.com] have been doing this with apps like Blender [https://www.blender.org/features/vfx/], Maya [https://www.autodesk.com/products/maya/overview], 3DS Max [https://www.autodesk.com/products/3ds-max/overview], After Effects [https://www.adobe.com/products/aftereffects.html], and more. DISCLAIMER: I work for BeBop [https://beboptechnology.com/bebop-expands-management-team-with-addition-of-bonini-cooper-kammes/] because I love their tech.

Transcoding, on the other hand, is a much more common way of using the horsepower of the cloud. As an example, ever seen the “processing” message on YouTube? Yeah, that’s YouTube transcoding the files you’ve uploaded into various quality formats. Where this can be beneficial for you is your deliverables. In today’s VOD landscape, creating multiple formats for various outlets is commonplace. Each VOD provider has the formats they prefer, and they are often not shy about rejecting your file. Don’t take it personally; often their playout and delivery systems function based on the files they receive being in a particular and exact format. As an example, check out the Netflix requirements [https://partnerhelp.netflixstudios.com/hc/en-us/categories/202282037-SPECIFICATIONS-GUIDES]. The basic “file flip” is the easy part, as the sketch below shows.
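A minimal sketch of that flip, using ffmpeg driven from Python (ffmpeg is assumed to be installed on the transcode node; the filenames and target settings are hypothetical stand-ins, not any outlet’s actual delivery spec):

```python
import subprocess

MASTER = "show_ep101_master.mov"  # hypothetical mezzanine file pulled from storage

# Hypothetical targets; real outlets publish exact codec, bitrate, and wrapper requirements.
TARGETS = [
    ("ep101_web_1080p.mp4",  ["-c:v", "libx264", "-b:v", "10M", "-c:a", "aac", "-b:a", "320k"]),
    ("ep101_proxy_720p.mp4", ["-vf", "scale=-2:720", "-c:v", "libx264", "-crf", "23",
                              "-c:a", "aac", "-b:a", "128k"]),
]

for output, codec_args in TARGETS:
    cmd = ["ffmpeg", "-y", "-i", MASTER, *codec_args, output]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)  # raise if ffmpeg exits with an error
```

In a cloud transcode farm, this loop gets fanned out across many instances by a queue, with each job pulling the master from object storage, but the per-file command looks much the same.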
The hitch here is metadata. Just using flopping power to flip the file doesn’t deliver all of the ancillary data that more and more outlets want. This can be captioning, various languages or alt angles, descriptive text, color information, and more. Metadata resides in different locations within the file, whether it be an MP4, MOV, MXF, IMF, or any of the other container formats. Many outlets also ask for specialized sidecar XML files. I cannot overstate how important this metadata mapping is, and how often it is overlooked.

You may wanna check out AWS Elastic Transcoder [https://aws.amazon.com/elastictranscoder/], which makes it pretty easy to not only flip files…but also do real-time transcoding if you’re into that sorta thing. Telestream [http://www.telestream.net/] also has its Vantage [http://www.telestream.net/vantage/overview.htm] software in the cloud, which adds things like quality control and speech-to-text functions. There are also specialty transcoding services, like PixelStrings by Cinnafilm [https://pixelstrings.com/], for those tricky frame rate conversions, high-quality retiming, and creating newer formats like IMF packaging [https://pixelstrings.com/support/documents/imf-101/].

4. VIDEO EDITING, AUDIO EDITING, FINISHING, AND GRADING

Audio and video editing, let alone audio mixing and video grading and finishing, are the holy grail for cloud computing in post-production, namely because these processes require human interaction at every step. Adding an edit, a keyframe, or a fader touch all require the user to have constant and repeatable communication with the creative tool. Cloud computing, if not done properly, can add unacceptable latency, as the user needs to wait for the keypress made locally to be reflected remotely. This can be infuriating for creatives. A tenth of a second can mean the difference between creativity and…carnage.

There are a few ways to tackle editing when not all of the hardware, software, or media is local to you…and sometimes you can combine them for a hybrid approach.

First, we have the private cloud, which can be your own little data center, serving up the media as live proxy streams to a remote creative with a typical editing machine. True remote editing.

Diagram: Remote streamed editing [https://5thingsseries.com/wp-content/uploads/2019/11/S03E05_5THINGS_remote-streamed-editing.png]

Next, we have the all-in approach: everything, and I mean absolutely everything, is virtualized in the cloud. The software application, the storage, all of it, and you access it through a basic computer or what we call a zero client.

Diagram: Cloud editing with VMs [https://5thingsseries.com/wp-content/uploads/2019/11/S03E05_5THINGS_cloud-editing-with-VMs.png]

Lastly, we have a hybrid approach: serve the media up from the cloud to a watered-down, web-page-based editor on your local machine.

Diagram: Remote editing in a web browser [https://5thingsseries.com/wp-content/uploads/2019/11/S03E05_5THINGS_remote-editing-in-a-web-broswer.png]

Each has its pros and cons. Both Avid [http://www.avid.com] and Adobe [http://www.adobe.com] have had versions of an on-premises server serving up proxies to remote editing systems for many years.
The on-prem server – a private cloud, for all intents and purposes – serves out proxy streams of media for use natively within an Avid Media Composer or Adobe Premiere Pro system connected remotely. Adobe called it Adobe Anywhere [https://5thingsseries.com/adobe-anywhere/], and today the application is…nowhere. The expensive product was shelved after a few years. Avid, however, is still doing this today [https://www.avid.com/solutions/cloud], using a mix of many Avid solutions, including the product formerly known as Interplay, now called MediaCentral | Cloud UX [https://www.avid.com/products/mediacentral/mediacentral-cloud-ux], a few add-on modules, along with a Media Composer | Cloud Remote license [https://www.avid.com/products/media-composer-cloud-remote]. It’s expensive, usually over $100K.

Back to Adobe: I’d be remiss if I didn’t mention 3rd party asset management systems that carry on the Adobe Anywhere approach. Solutions like VPMS EditMate from Arvato Bertelsmann [https://us.arvato-systems.com/arvato-systems-us/industries/industries-overview/media-entertainment/video-production-management-suite/vpms-editmate] or Curator from IPV [https://www.ipv.com/product/#curator-for-adobe] are options, but they are built around their enterprise asset management systems, so don’t expect the price tag to be anything but enterprise.

The all-in cloud approach means your NLE and all of the supporting software tools and storage are running in a VM – a virtual machine – in a nearby data center. This brings you the best of both worlds. Your local machine is simply a window into the cloud-hosted VM, which brings you all the benefits of the cloud, presented in a familiar way: a computer desktop. And you don’t have expensive internal infrastructure to manage. This is tricky, though, as creatives need low latency, and geographical distance can be challenging if not done right. A few companies are accomplishing this, however, using robust screen share protocols and nearby data centers. Avid has Media Composer and NEXIS running on Azure, available with Avid’s new “Edit on Demand” product [https://www.avid.com/products/avid-edit-on-demand]. BeBop Technology [http://www.beboptechnology.com] is accomplishing the same thing, but with dozens of editorial and VFX apps, including Avid and Premiere [https://beboptechnology.com/software-we-support/]. Disclaimer: I still work for BeBop. Because their technology is the sh&t.

Some companies have investigated a novel approach: why not let creatives work in a web browser, to ensure cross-platform availability and to work without the proprietary nature that all major NLEs inherently have? This is a gutsy approach, as most creatives prefer to work within the tools they’ve become skilled in. However, for less intensive creative tasks, like string-outs or pulling selects performed by users who may not be full-time power editors, it’s an option. Avid adds some of this functionality into their newer Editorial Management [https://www.avid.com/products/mediacentral-editorial-management] product. Another popular choice for web browser editing is Blackbird, formerly known as FORscene by Forbidden Technologies [https://www.blackbird.video/]. This paradigm is probably the weakest for you pro editors out there. I don’t know about you, but I want to work with the tools I’ve spent years getting better at. Alas, Mac-only apps like Final Cut Pro X [https://www.apple.com/final-cut-pro/] are strictly local/on-premises solutions.
And while there are Mac-centric data centers, the Apple hardware ecosystem often limits configuration options compared to its PC counterparts. Most Mac data centers also do not have the infrastructure to provide the robust screen sharing protocols that would make remote Apple-based editing worthwhile. Blackmagic’s Resolve, while it has remote workflows [https://www.colouristsarinamccavana.com/single-post/2017/08/04/Remote-Grading-in-DaVinci-Resolve], still requires media to be located on both the local and remote systems. This effectively eliminates any performance benefits found in the cloud.

Chart: 1 second in both audio and video [https://5thingsseries.com/wp-content/uploads/2019/11/S03E05_5THINGS_audio-samples-vs-film-video-frames.png]

Audio, my first love, has some way to go. While basic audio in an NLE can be accomplished with the methods I just outlined, emulating pro post audio tools can be challenging. Audio is usually measured in samples. Audio sampled at 48kHz is actually 48,000 individual samples a second. Compare this to 24 to 60 frames a second for video, and you can see why precision is needed when working with audio. This is one reason the big DAW companies don’t yet sanction running their apps in the cloud: creative work at the sample level, with latency added by remote machines, makes this a clunky and ultimately unrewarding workflow. Pro Tools Cloud [https://www.avid.com/pro-tools/cloud-collaboration] is a sort of hybrid, allowing near real-time collaboration on audio tracks and projects; however, audio processing and editing is still performed locally.

On to finishing and color grading in the cloud. Often these tasks take a ton of horsepower, and you’d think the cloud would be great for that! And it will be…someday. These processes usually require the high-res or source media, not proxies. This means the high-res media has to be viewed by the finishing or color grading artist. That leaves us with 1 or 2 unacceptable conditions: 1. Cloud storage that can also play the high-res content is prohibitively expensive, and 2. There isn’t a way to transmit high-res media streams in real time to be viewed, and thus graded, without unacceptable visual compression. But NDI [https://en.wikipedia.org/wiki/Network_Device_Interface], you cry! Yes, my tech lover, we’ll cover that in another episode. While remote grading with cloud media is not quite there, remote viewing is a bit more manageable. And we’ll cover that…now.

5. REVIEW AND APPROVE (AND BONUS!)

Review and approve is one of the greatest achievements of the internet era for post-production. Leveraging the internet and data centers to house your latest project for feedback is now commonplace. This can be something as simple as pushing to YouTube [http://www.youtube.com] or Vimeo [https://vimeo.com/], or shooting someone a Dropbox link. While this has made collaboration without geographic borders possible, most solutions rely on asynchronous review and approve…that is, you push a file somewhere, someone watches it, then gives feedback. Real-time collaboration, or synchronous review and approve – meaning the creative stakeholders are all watching the same thing at the same time – is a bit harder to do. As I mentioned earlier, real-time, high-fidelity video streaming can cause artifacts, out-of-sync audio, and reduced frame rates, and all of this can take the user out of the moment.
This is where more expensive solutions that are more in line with video conferencing surface; popular examples include Sohonet’s ClearView Flex [https://www.sohonet.com/clearview-flex/], Streambox [https://www.streambox.com/], or the newer Evercast solution [https://www.evercast.us/]. In this case, however, these tools are mostly using the cloud as a point-to-point transport mechanism, rather than leveraging the horsepower in the cloud. NDI holds a great deal of promise; as I already said, we’ll cover that in another episode.

Back to non-real-time, asynchronous review and approve: the compromises of working in an asynchronous fashion are slowly being eroded by the bells and whistles built on top of the basic premise of sharing a file with someone not local to you. Frame.io is dominating this space, with plug-ins and extensions for access from right within your NLE, a desktop app for fast media transfers, plus a web page review and approval process which is by far the best out there. Wiredrive and Kollaborate are other options, also offering web page review and approve features.

I’m also a big fan of having your asset management system tied into an asynchronous review and approval process. This allows permitted folks to see even more content and have any changes or notes tracked within 1 application. Many enterprise DAMs have this functionality. A favorite of mine is CatDV [http://squarebox.com], which has these tools built in, as well as Akomi by North Shore Automation [https://www.northshoreautomation.com/akomi], which has an even slicker implementation and the ability to run in the cloud.

As a bonus cloud tool, I’m also a big fan of Endcrawl, an online service that generates credit crawls for your projects without the traditional visual jitteriness from your NLE [http://endcrawl.com/blog/why-are-my-end-titles-jittering/], and without the inevitable problems of 37 [https://i.pinimg.com/originals/49/a4/40/49a440a24046cea401c8605c1b69d33f.jpg] credit revisions.

A heartfelt thank you to everyone who reached out via text or email or shared my last personal video [https://youtu.be/_r6XrMEJ4Sw]. It means more than you know.

Until the next episode: learn more, do more. Like early, share often, and don’t forget to subscribe. Thanks for watching.

The post An Intro to Using the Cloud for Post Production [https://5thingsseries.com/episode/an-intro-to-using-the-cloud-for-post-production/] appeared first on 5 THINGS - Simplifying Film, TV, and Media Technology [https://5thingsseries.com].

13. nov. 2019 - 20 min
Blackmagic eGPU Pro & 2018 Mac Mini vs. FCPX, Adobe Premiere Pro, DaVinci Resolve, and Avid Media Composer

On this episode of 5 THINGS, we’re checking out the 2018 Mac Mini from Apple [https://www.apple.com/mac-mini/], and the new eGPU Pro offering from Blackmagic [https://www.blackmagicdesign.com/products/blackmagicegpu/] and…well, Apple [https://www.apple.com/shop/product/HMQT2VC/A/blackmagic-egpu-pro]. We’re also running benchmarks against your favorite (or least favorite) NLEs: Final Cut Pro X [https://www.apple.com/final-cut-pro/], DaVinci Resolve [https://www.blackmagicdesign.com/products/davinciresolve/], Adobe Premiere Pro [https://www.adobe.com/products/premiere.html], and Avid Media Composer [https://www.avid.com/media-composer]. Let’s get to it.

NOTE: The eGPU Pro in this episode was a pre-release, beta model. Your results may vary with the shipping version.

1. 2018 APPLE MAC MINI

What better machine to test how well an external GPU works than one with a horrible built-in GPU? Yes, tech friends, even the powerful top-of-the-line 2018 Mac Mini is built on Intel technology, which means there is a small GPU on the chip. While this GPU, the Intel UHD Graphics 630, isn’t going to break any performance records, it does let a user feed their screen without a 3rd party graphics card.

Photo: 2018 Mac Mini and Blackmagic eGPU Pro, rear, connected [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_rear-connected.png]

The Mac Mini also rocks USB-C for Thunderbolt 3 access. It’s this 40Gb connection that provides the bandwidth for an eGPU to shine. Slower I/O, like Thunderbolt 2 or even older connections, simply doesn’t provide enough bandwidth to accommodate all that a modern GPU can deliver. The new Mini provides four USB-C Thunderbolt 3 ports, enough for up to 2 eGPUs. I dig the legacy USB 3 ports, too…as we all have peripherals like keyboards and mice that still rock legacy USB Type-A.

Back to throughput: the 2018 Mac Mini also has an option for a 10GigE copper connection. I know, many see 10GigE as a sign that it’s “for professionals”. Quite frankly, 10GigE is the new 1GigE, so if you haven’t been looking into it, now would be a good time.

The Mini comes in many different configurations, from an entry-level i3 processor to the midrange i5, to the i7 model that I tested with. The units can be configured with 4 or 6 cores, and speeds of up to 4.6GHz in Turbo Boost mode. For RAM, the 2018 Mini supports up to 64GB, although you may need to take it to an Apple Store to get it installed. You can also build your system out with up to a 2TB PCIe SSD. My testing unit was the 6 core i7, with 32GB of RAM.

Screenshot: 2018 Mac Mini test unit specs [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_mac-mini-specs.png]

I ran some benchmarks on the Mini and compared the results with Geekbench [https://www.geekbench.com/] scores of other popular Apple machines.
Chart: Geekbench scores – 2018 Mac Mini, 2013 Mac Pro, 2017 MacBook Pro, and Hackintosh [http://5thingsseries.com/wp-content/uploads/2018/12/geekbench_2018-mac-mini-2013-mac-pro-2017-mabook-pro-and-hackintosh.png]

Here, we have a top-of-the-line 2013 Mac Pro and a maxed-out 2017 MacBook Pro, plus my Hackintosh [https://5thingsseries.com/building-a-hackintosh/] build from a few episodes back. You can see the 2018 Mac Mini is apparently no slouch when it comes to performance. Plus, it helps that the last Mac Pro is over 5 years old, too. Let that sink in for a minute.

During testing, I was alerted to the fact that the Mini will throttle the chip speed when the unit hits 100 degrees Celsius, so if you push the system too hard, performance will suffer.

Screenshot: Speed throttling temperature during testing [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_speed-throttling-temperature.png]

That kinda blows. There are a ton of in-depth 2018 Mac Mini reviews out there, and let’s face it, you’re really here for the eGPU Pro and video post-production stuff…so let’s move on to that…now.

2. BLACKMAGIC EGPU PRO

Like the non-Pro model before it, the eGPU Pro from Blackmagic simplifies the addition of enhanced graphics processing by putting a GPU in an external enclosure. Also like the previous model, the Pro has an 8GB card in it, albeit a faster one than the Radeon Pro 580 in the previous generation: the new GPU is the Radeon RX Vega 56. Both models are meant to capitalize on the Metal playback engine, although they will apparently utilize OpenCL or CUDA. Given the future of both of those last two standards on the Mac platform…best start getting used to Metal.

Chart: eGPU Pro specs [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_eGPU-Pro-specs.png]

The eGPU Pro also features a DisplayPort connection, unlike the non-Pro version. This allows for up to 5K monitoring if you so desire…as the HDMI 2.0 port tops out at 4K DCI at 60fps. Both units come with 4 USB 3.0 ports and a spare USB-C Thunderbolt 3 port so you can connect even more peripherals through the eGPU.

Now, this sucker is damn quiet. Aside from a small, white LED indicator at the bottom of the unit, and an icon in the menu bar, you wouldn’t even know the unit is running.

Photo: The Blackmagic eGPU Pro is very quiet; the only physical indicator is a small white LED at the bottom [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_bottom-light.png]

The unit is mostly a massive heatsink, so expect heat to come pouring out of the top. The eGPU Pro ships with a 1 foot USB-C cable.
This cable is VERY short, which means this unit is always going to be next to your computer. While I didn’t test it, I understand longer USB-C cables can cause issues, so stick with the cable that came with it.

Photo: Size comparison – it’s imposing, and so is the size of the Blackmagic eGPU Pro [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_eGPU-size-comparison-scotch.png]

While the eGPU Pro is cool looking, it’s pretty hefty and does take up valuable desktop real estate. It also dwarfs gear next to it, like the Mac Mini. In fact, it’s larger than a 2013 Mac Pro, among other…things.

What is uniquely different, however, is that the eGPU Pro is only certified to work with macOS 10.14 Mojave. I like this, as support for eGPUs in macOS 10.13 High Sierra was a crapshoot at best. Interestingly enough, nothing needs to be installed in Mojave for the eGPU Pro to run – the GPU drivers are built into the OS. In fact, the eGPU Pro comes with no drivers or software to install.

I tested with a pre-shipping model of the Pro – in fact, the packaging was still for the non-Pro version. As such, there was no documentation or quick start guide with the unit. The ship date for the eGPU Pro has now slipped a month, from November 2018 to December 2018; perhaps code is still being written for Resolve and FCPX, as I did find some issues. Let’s check that out now.

3. APPLE FINAL CUT PRO X AND BLACKMAGIC DAVINCI RESOLVE

Please check out the video for specifics!

Chart: FCPX render performance with and without eGPU [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_fcpx-render-performance-with-and-without-egpu.png]
Chart: Compressor export performance with and without eGPU [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_compressor-export-performance-with-and-without-egpu.png]
Chart: FCPX stream count and type [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_fcpx-stream-count-and-type.png]
Chart: Resolve render performance with and without eGPU [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_resolve-render-performance-with-and-without-egpu.png]
Chart: Resolve export performance with and without eGPU [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_resolve-export-performance-with-and-without-egpu.png]
Screenshot: PowerGrades by Jason Bowdach, used to bring the Resolve system to its knees [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_resolve-power-grades-by-jason-bowdach.png]
Chart: Resolve stream count and type with and without eGPU [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_resolve-stream-count-and-type-with-and-without-egpu.png]

4. ADOBE PREMIERE PRO AND AVID MEDIA COMPOSER

Please check out the video for specifics!

Chart: Adobe Premiere Pro render performance with and without eGPU [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_adobe-premiere-pro-render-performance-with-and-without-egpu.png]
Chart: Adobe Media Encoder export performance with and without eGPU [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_adobe-media-encoder-export-performance-with-and-without-egpu.png]
Chart: Adobe Premiere Pro stream count and type [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_adobe-premiere-pro-stream-count-and-type.png]
Screenshot: Avid Media Composer disables all GPU effects, as the Mac Mini’s onboard GPU fails to meet Media Composer’s requirements [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_avid-media-composer-mac-mini-gpu-warning.png]
Chart: Avid Media Composer render performance with and without eGPU [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_avid-media-composer-render-performance-with-and-without-egpu.png]
Chart: Avid export performance with and without eGPU [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_avid-export-performance-with-and-without-egpu.png]
Chart: Avid Media Composer stream count and type [http://5thingsseries.com/wp-content/uploads/2018/12/S03E04-2018-Mac-Mini-and-Blackmagic-eGPU-Pro_avid-media-composer-stream-count-and-type.png]

5. CONCLUSIONS

Lots of data to crunch! It’s obvious that despite the initial high Geekbench scores, the 2018 Mac Mini on its own doesn’t really have a fit for creative uses in post, unless it’s used as a file server, or perhaps as a lone Mac in a sea of Windows machines when you need an easy way to create ProRes files. I concede that it may work for story producers or preditors, but I find that it’s never a good practice to hit the upper limits of a new machine from day 1. I should also note that the 2018 Mac Mini is not qualified by all NLE manufacturers. Avid hasn’t qualified it [http://avid.force.com/pkb/articles/en_US/compatibility/en422411] as of now.

Screenshot: Avid qualified systems, 11/2018 [http://5thingsseries.com/wp-content/uploads/2018/12/avid-qualified-systems.png]

It doesn’t meet the minimum specs that Adobe publishes [https://helpx.adobe.com/premiere-pro/system-requirements.html].

Screenshot: Adobe Premiere Pro qualified systems, 11/2018 [http://5thingsseries.com/wp-content/uploads/2018/12/adobe-qualified-systems.png]

The Blackmagic DaVinci Resolve Configuration Guide [https://documents.blackmagicdesign.com/ConfigGuides/DaVinciResolve15/20180407-79c607/DaVinci_Resolve_15_Configuration_Guide.pdf] specifically calls out that the Mini should not be used.

Screenshot: Blackmagic Resolve 15 Mac Mini warning, 11/2018 [http://5thingsseries.com/wp-content/uploads/2018/12/resolve-requirements-1.png]

Apple’s requirements for Final Cut Pro X [https://www.apple.com/final-cut-pro/specs/], on the other hand, are barely met by the Mini.

Screenshot: Apple FCPX system requirements, 11/2018 [http://5thingsseries.com/wp-content/uploads/2018/12/fcpx-system-requirements.png]

Personally, I’d save the $2000 this machine costs and put that towards an iMac or iMac Pro, or perhaps the highly anticipated 2019 Mac Pro. Now, as for the eGPU, it’s important to remember that all of this is new. This was essentially a science experiment. Mojave, as of this episode, is only at 10.14.1.
Apple, one of the partners in this eGPU collaboration, still doesn't have all of the bugs worked out, as Compressor doesn't appear to utilize it. That being said, the speed increases in FCPX and Resolve are undeniable and certainly showcase what a good GPU can bring to the post table. Adobe, while not part of the Apple/Blackmagic eGPU soiree, also shows speed benefits out of the box, and that's before Adobe has done any optimization for it. That's pretty impressive. Avid, on the other hand, has never been a GPU powerhouse. My tests really only showed that Avid has some serious engineering to do to incorporate eGPUs into future releases.

If this eGPU Pro solution could breathe life into an old system, I might be more inclined to suggest it…however, the fact that you need a relatively recent Thunderbolt 3 enabled machine makes that "old system" label inappropriate. Now, you can 'roll your own' eGPU by using a 3rd party external chassis and a compatible graphics card. Those parts can be purchased for under $800 on the street…$400 cheaper than the eGPU Pro. While I don't expect an all-in-one solution to cost the same as a DIY build, a $400 delta is too big of a chunk to ignore.

The only real case where I can suggest a solution like this is for folks who edit on a MacBook or a MacBook Pro: those road warriors who are on the go and editing, but need to come back to home base at some point and do some serious heavy lifting. But is that market large enough? Gaming and other GPU-enabled applications have a much wider reach than the Post community. Regardless, I'm excited to see if, in 12-18 months, NLEs have been able to start utilizing eGPUs better than they do today.

Have more Pro eGPU and Mac Mini questions other than just these 5? Ask me in the Comments section. Until the next episode: learn more, do more. Like early, share often, and don't forget to subscribe. Thanks for watching.

* Evaluation gear provided by Michael Horton / LACPUG [http://www.lafcpug.org/]
* Resolve power grades by Jason Bowdach [http://www.cineticstudios.com/]
* RED Footage courtesy RED.com [http://RED.com]
* XAVC Footage courtesy Alex Pasquini / alexpasquini.com [http://alexpasquini.com]
* Cut Trance – Cephelopod by Kevin MacLeod is licensed under a Creative Commons Attribution license (https://creativecommons.org/licenses/by/4.0/) Source: http://incompetech.com/music/royalty-free/index.html?isrc=USUAN1100273 Artist: http://incompetech.com/

The post Blackmagic eGPU Pro & 2018 Mac Mini vs. FCPX, Adobe Premiere Pro, DaVinci Resolve, and Avid Media Composer [https://5thingsseries.com/episode/blackmagic-egpu-pro-and-2018-mac-mini-vs-fcpx-adobe-premiere-pro-davinci-resolve-and-avid-media-composer/] appeared first on 5 THINGS - Simplifying Film, TV, and Media Technology [https://5thingsseries.com].

09. dec. 2018 - 24 min
episode YouTube Tips And Tricks For Your Media artwork
YouTube Tips And Tricks For Your Media

On this episode of 5 THINGS, I've got a few tricks that you may not know about that will help you upload, manage, and make YouTube [http://www.youtube.com] do your bidding.

1. UPLOAD TRICKS

You probably have media on YouTube, and you probably think that after thousands of hours, you've mastered the 'Tube. But there are some little-known upload tricks and workarounds, playback shortcuts, and voodoo that you may not even know YouTube does. Let's start at the beginning. Let's say you've got a masterpiece of a video; maybe it's an uncle getting softballed in the crotch, maybe it's your buddy taking a header down a flight of stairs, or maybe, just maybe, it's the cutest pet in the world. MINE.

Lucy: the cutest, bestest pet in the world. [http://5thingsseries.com/wp-content/uploads/2018/09/5-THINGS-YouTube-Tips-Tricks-For-Your-Media-Lucy.gif]

…and the world needs to see it, right? So you export a totally-worth-the-hard-drive-space, huge, monster master file so you don't lose 1 bit of quality. And that's OK. Now, in actuality, it's not the most efficient, but let's save that existential technology discussion for another episode. Did you know that YouTube will actually take this massive file? That you don't need to shrink it, recompress it, or otherwise use the presets in your transcoder du jour? YouTube used to publish a page that specified video file sizes and data rates for "enterprise" grade connections. Ostensibly, this was so companies with fat internet connections could upload massive files. After all, YouTube re-compresses all files anyway. Yes, as I've said many times before, YouTube will ALWAYS recompress your media. ALWAYS.

"Enterprise" video bitrates for YouTube. These are 5-6x LARGER than what YouTube recommends for typical users [https://support.google.com/youtube/answer/1722171?hl=en]. [http://5thingsseries.com/wp-content/uploads/2018/09/YouTube-upload-connection-speed-old-20141026_09h29m56s_001__zps99f4eceb_clean.png]

But, this page was taken down. Why? Because accepting larger file sizes ties up YouTube's servers, and takes longer for their transcoding computers to chomp through and subsequently create the versions you'll end up watching. Plus, it's a crappy experience for you, the end user, to wait hours for an upload AND the processing. Despite this, you can still do it. In the above video, you can see I have an HD ProRes file, and it's several GB. As I select the file and start the upload, YouTube tells me this will take several hours. That sucks. However, uploading a less compressed file means the versions YouTube creates will be based on a higher-quality source than the compressed version you'd normally export from your video editor or create in your media encoder.

YouTube creates all of your media variants…so you don't have to. [http://5thingsseries.com/wp-content/uploads/2018/09/5-THINGS-S03E03-YouTube-Tips-and-Tricks-For-Your-Media-youtube-quality-versions.jpg]
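To put a rough number on that "several hours" wait, here's a minimal back-of-the-napkin sketch in Python. The file name and the 20 Mbps upstream speed are assumptions for illustration, not anything YouTube publishes; swap in your own master and measured connection speed.

```python
import os

# Hypothetical example values; substitute your own master file and measured upstream speed.
master_file = "lucy_master_prores.mov"   # assumed file name
upload_mbps = 20.0                        # assumed upstream bandwidth, in megabits per second

size_bytes = os.path.getsize(master_file)
size_megabits = size_bytes * 8 / 1_000_000   # bytes -> megabits

# Rough estimate only: ignores protocol overhead and YouTube's server-side processing time.
hours = size_megabits / upload_mbps / 3600
print(f"{size_bytes / 1e9:.1f} GB at {upload_mbps} Mbps ≈ {hours:.1f} hours to upload")
```

Estimates like this are crude, but they explain why YouTube's progress bar quotes hours for a fat ProRes master on a typical home connection.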
So what would you rather have…a copy of a copy of your finished piece? Or just 1 copy? The less compression you use, the better the final video will look. More on this here: a YouTube user uploads, downloads, and re-uploads the same video to YouTube 1,000 times [https://gizmodo.com/5555359/the-weirdness-of-a-youtube-video-re-uploaded-1000-times] and the compression is staggering. This is called "generation loss" [https://en.wikipedia.org/wiki/Generation_loss].

By the way, you know that YouTube creates all of your various versions, right? From the file you upload, YouTube creates the 240, 360, 480, 720, 1080 and higher versions. Are yours not showing up? Have patience. YouTube takes a little bit.

OK, back to the high-res fat files. I know, the long upload for these large files sucks, but it does lead me to the next tip. Did you know you can resume a broken upload [https://support.google.com/youtube/answer/4525858]? As long as you have the same file name, you can resume an upload for 24 hours after a failed upload. In the video above, let me show you. Here's the file I was uploading before. As you can see, it still has quite a way to go. Oops, closed my tab. Now, I'm gonna wait a little bit. OK, it's been several hours, and I'm going to open a new tab and navigate back to the upload page. Now, I'll reselect the same file. The upload picks up where it left off, and the file continues to upload. Great for when the internet goes out.

2. CAPTIONING HACKS

I've been a massive proponent of video captioning for years. Mainly for making tech information accessible to as many people as possible, but also because it boosts the visibility of your videos via SEO, or Search Engine Optimization [https://en.wikipedia.org/wiki/Search_engine_optimization]. Engines like Google now index not only the tags you assign to the video, but also the objective content that's heard in the video…what's being said.

Downloading captions from YouTube [http://5thingsseries.com/wp-content/uploads/2018/09/5-THINGS-S03E03-YouTube-Tips-and-Tricks-For-Your-Media-youtube-captions-download.png]

Now, YouTube's captioning is not foolproof. If the audio recording quality is poor, or if there are a lot of slang terms or names, the auto-captions may be wrong. However, the auto-captions are a great start, because not only do they attempt to get stuff right, they also TIME the captions for you. After YouTube does its best to auto-caption, you can either use YouTube's built-in transcript and caption editor to fix mistakes and further refine the timing, or you can download that same transcript and use it in your NLE, so you can have a caption file on any other platform you like. Many other professional VOD outlets or platforms accept closed captioning as embedded captions, or even as sidecar files [https://en.wikipedia.org/wiki/Sidecar_file]. This also means you can turn the text of your transcript into an accompanying blog post. In fact, that is what I do for this web series and podcast.

Another hack: if you have scripted content or a teleprompter script, you can upload that to YouTube as well. YouTube will then time the script you have against the video you've uploaded and convert the script into captions…which, once again, you can download and use.
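As an aside, once you've downloaded a sidecar caption file, turning it into blog-post text is a few lines of scripting. Here's a minimal sketch, assuming an SRT-formatted caption file (the file name is a placeholder); it strips the cue numbers and timecode lines and joins what's left into plain transcript text.

```python
import re

def srt_to_text(srt_path: str) -> str:
    """Strip cue numbers and timecodes from an SRT caption file,
    leaving plain transcript text suitable for a blog post."""
    with open(srt_path, encoding="utf-8") as f:
        raw = f.read()

    kept = []
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.isdigit():  # cue number
            continue
        if re.match(r"^\d{2}:\d{2}:\d{2}[,.]\d{3}\s+-->", line):  # timecode line
            continue
        kept.append(line)
    return " ".join(kept)

if __name__ == "__main__":
    # "captions.srt" is a placeholder for whatever caption file you exported.
    print(srt_to_text("captions.srt"))
```

Depending on the caption format you download, the timecode pattern differs slightly, but the idea is the same: throw away the timing, keep the words.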
As a side note, I have a tutorial on how to accomplish this very thing, turning your script into timed captions using YouTube [http://michaelkammes.com/workflow/get-your-scripted-content-timed-for-captions-using-youtube/]…check it out. It's not only a massive time saver, but also a way to boost your SEO ranking and give you content for a webpage.

3. PLAYBACK SHORTCUTS

Keyboard shortcuts are your friend. The mouse is wholly inefficient and only serves to slow you down. Have no fear, the smart folks at YouTube have incorporated many of the shortcuts found in your NLE into YouTube. Let's check some of 'em out. Let's start with the old reliable "J", "K", "L", or rewind, play/pause, and fast forward. J rewinds 10 seconds. K plays/pauses. L jumps forward 10 seconds. What about Spacebar for play/pause? You betcha.

My favorites are the frame-by-frame shortcuts, especially when you're trying to deconstruct that brand new movie trailer that just dropped, or trying to emulate and reverse engineer an effect from someone else's video. For that, use the comma key to nudge 1 frame at a time in reverse. Conversely, you can hit the period key to advance 1 frame. You can also add the shift key to adjust the playback speed globally. Shift + the period key speeds playback up by 25%, and hitting it again boosts playback to 50% faster. The reverse is also true for the shift + comma combo: 75% speed, 50% speed, and so on. Here's a good list of shortcuts to build some muscle memory with [https://support.google.com/youtube/answer/7631406?hl=en].

YouTube keyboard shortcuts [http://5thingsseries.com/wp-content/uploads/2018/09/youtube-keyboard-shortcuts-edit.png]

Now, if you've spent any amount of time on Netflix [http://www.netflix.com], you've probably stumbled across this window…

Basic Netflix stats [http://5thingsseries.com/wp-content/uploads/2018/09/5-THINGS-S03E03-YouTube-Tips-and-Tricks-For-Your-Media-netflix-basic-stats.jpg]

…or come across this substantially more involved window: A/V Stats.

Advanced Netflix Stats: Display A/V Stats (Ctrl + Shift + Alt + D) [http://5thingsseries.com/wp-content/uploads/2018/09/5-THINGS-S03E03-YouTube-Tips-and-Tricks-For-Your-Media-netflix-advanced-stats.jpg]

YouTube has something very similar: a handy-dandy "Stats for Nerds" window. When playing back a YouTube video, right click on the player window and select "Stats for Nerds" from the menu.

Right click on the player window to load YouTube's "Stats for Nerds". [http://5thingsseries.com/wp-content/uploads/2018/09/5-THINGS-S03E03-YouTube-Tips-and-Tricks-For-Your-Media-stats-for-nerds.jpg]

An overlay window will pop up in the upper left showing you all sorts of nerd goodness. This may pacify your need for numbers, but it's also a great troubleshooting tool for playback issues.
You may notice in this overlay the heading "Volume / Normalized". This is something very sneaky, which we'll go into…now.

4. AUDIO SECRETS

A little-known secret is that not only does YouTube always re-compress your video, it also diddles with your audio. And I don't mean just a simple transcode; I'm talking about adjusting your audio levels and sonic dynamics. More specifically, Loudness Equalization. You can see this in the last tip: when we checked out Stats for Nerds, this is what the "Volume / Normalized" heading is. YouTube analyzes the audio in your video and determines whether the volume, at certain places or overall, needs to be adjusted to a pleasant playback level for the viewer.

YouTube is messing with your audio! [http://5thingsseries.com/wp-content/uploads/2018/09/5-THINGS-S03E03-YouTube-Tips-and-Tricks-For-Your-Media-stats-for-nerds-audio.jpg]

In this example, the first percentage is the volume in your YouTube player window, and the second percentage is how much normalization YouTube is applying; that is, how much the volume is being reduced. The last number is the "content loudness" value, which is YouTube's estimate of the loudness level.

Now why on Earth would YouTube do this? It goes back to something I mentioned earlier, and that's user experience. If you had to constantly change the volume on your headphones, your computer speakers, or your TV for every video you watched, you'd get pretty annoyed, right? It's also quite dangerous to have one video that's very quiet, so you turn the volume up, only to have your ears blown off when the next video that plays is super loud. A variant of this is actually part of the CALM Act [https://www.fcc.gov/media/policy/loud-commercials], which holds broadcasters accountable for the same thing.

You're probably asking, "How can I stop YouTube from doing this to my mix?" Well, just like stopping YouTube from re-compressing your video…you can't! However, what you can do is minimize the audio tweaking YouTube does, so that as much of your artistic intent on loudness as possible is retained. You can start by metering and mixing your audio using Loudness Units relative to Full Scale [https://en.wikipedia.org/wiki/LKFS], otherwise known as LUFS. YouTube doesn't release exact values, and reports differ, but YouTube tends to shoot for -12 to -14 LUFS, so try to keep your audio in that ballpark to reduce the amount of tinkering YouTube does. That being said, aiming for a specific integrated loudness doesn't work reliably. Experiment and see what sounds good to your ears.

LUFS metering with iZotope's Insight [https://www.izotope.com/en/products/mix/insight.html]. [http://5thingsseries.com/wp-content/uploads/2018/09/5-THINGS-S03E03-YouTube-Tips-and-Tricks-For-Your-Media_LUFS.jpg]
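If you don't have a loudness meter plugin handy, here's a minimal sketch that checks where a mix lands using ffmpeg's loudnorm filter in analysis-only mode. It assumes ffmpeg is installed and on your PATH; the file name and the -14 LUFS reference are placeholders consistent with the ballpark above, not values YouTube publishes.

```python
import json
import subprocess

def measure_loudness(path: str) -> dict:
    """Run ffmpeg's loudnorm filter in analysis mode and return its measurement JSON."""
    cmd = [
        "ffmpeg", "-hide_banner", "-i", path,
        "-af", "loudnorm=I=-14:TP=-1.5:LRA=11:print_format=json",
        "-f", "null", "-",
    ]
    # loudnorm prints its measurement JSON to stderr at the end of the analysis pass.
    result = subprocess.run(cmd, capture_output=True, text=True)
    stderr = result.stderr
    start, end = stderr.rfind("{"), stderr.rfind("}") + 1
    return json.loads(stderr[start:end])

if __name__ == "__main__":
    stats = measure_loudness("my_youtube_mix.wav")   # placeholder file name
    print("Integrated loudness:", stats["input_i"], "LUFS")
    print("True peak:", stats["input_tp"], "dBTP")
```

The "input_i" value is the number to compare against that -12 to -14 LUFS ballpark; treat it as a guideline for how much normalization YouTube is likely to apply, not a guarantee.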
Further reading on YouTube, audio compression, and LUFS:

* https://productforums.google.com/forum/#!topic/youtube/JQouU5gi1ZA
* http://productionadvice.co.uk/youtube-loudness/
* http://productionadvice.co.uk/stats-for-nerds/
* http://productionadvice.co.uk/how-loud
* http://swt.audio/maximising-audio-for-youtube/

5. ADVERTISING TIPS

Now, there may be a point in time when you decide to do some advertising, or perhaps make recommendations to your friends or clients on advertising. This is where AdWords [https://adwords.google.com/] can go from simple to difficult very quickly. But what if I told you about hyper-focused marketing techniques that are not too difficult? Let's assume you know the basics of AdWords. At a very, very high level, the kinds of ads you can create include an in-stream video ad, either skippable or not, that runs before the start of a video, or an in-display ad, which appears as a suggested video in the YouTube sidebar to the right of the video you're currently watching.

YouTube TrueView video ad types [http://5thingsseries.com/wp-content/uploads/2018/09/youtube-trueview-video-ad-types-1024x564.jpg]

There are more parameters to set up within AdWords, but you basically give AdWords a budget, and that budget is used to bid on your ad placements on your behalf against other advertisers. You're probably asking, "But how do I have my ads placed on the RIGHT videos?" Here's where the tricks come into play. AdWords allows you not only to bid on the general types of videos you can place your ads on, like content for folks interested in sports or technology, but also to specify the exact videos and channels where your ad is shown. Provided those videos or channels you wanna advertise on accept monetization, you can selectively choose to advertise on them. This gives you hyper-focused marketing.

AdWords allows you to selectively show your video ad on a specific video OR channel. [http://5thingsseries.com/wp-content/uploads/2018/09/5-THINGS-S03E03-YouTube-Tips-and-Tricks-For-Your-Media-adwords-placements.jpg]

And what's to stop you from doing a custom intro for each of the commercials you place? This allows you to speak to the exact users consuming the video and market directly to them. Think about that for a minute. You can reference the content in the exact video you're advertising on. You can speak to the people who are watching that exact video. Instead of casting a wide net, you're firing a narrow, highly targeted advertising bullet. We're talking assassin-type precision.

Do you have more YouTube tips and tricks other than just these 5? Tell me about them in the Comments section. Also, please subscribe and share this tech goodness with the rest of your techie friends. They should thank you and buy you a drink. Until the next episode: learn more, do more. Like early, share often, and don't forget to subscribe.
Thanks for watching. The post YouTube Tips And Tricks For Your Media [https://5thingsseries.com/episode/youtube-tips-and-tricks-for-your-media/] appeared first on 5 THINGS - Simplifying Film, TV, and Media Technology [https://5thingsseries.com].

12. sep. 2018 - 13 min