If You Haven’t Read About Sora Yet
While I, along with every vlogger and author I follow, wrote a longer post dedicated to OpenAI’s new video generation tool, it’s possible you haven’t read about it yet. If you need to get up to speed, you can read this article from Axios about how it will upend traditional filmmaking. Or you can get a deeper explanation of how it can render 3-D space and will be used in ‘world-building’, which is particularly relevant to video games.
I can tell you that OpenAI’s hype machine can say “Mission Accomplished.” After seeing those demos, I’m filled with longing: when can I get access? I tried generating some video in RunwayML, and it seemed so limited and pointless by comparison.
Should Designers Learn Unreal Engine?
I’m currently teaching a Motion Graphics class that focuses mainly on After Effects, but I’m (obviously) sneaking in a little RunwayML and Pika Labs. Recently, one of my students asked, “Should we also be learning Unreal Engine?” For those of you not familiar, Unreal Engine (UE) is a set of tools used by video game designers and developers to create interactive environments. It can be used to build simulations, edit video and sound, and render animations. I didn’t really have an answer for him, but I’ve been stumbling across a lot of tips and tutorials lately, so the answer is YES. Let me qualify that for a moment: if you’re pursuing a career path in motion or 3-D, you should probably also add Unreal Engine to your skill set. According to animost.com, it can take around 1–2 weeks to grasp the basics, understand the interface, and learn basic functionality. Moving to an intermediate level can take around 2–4 months.
Yes, designers can learn Unreal Engine. Unreal Engine is a game engine that is useful for dynamic lighting, animation, and sound, and it can be good for creating games, making indie movies, or wowing clients. Learning Unreal Engine can also pay off for designers: it can add a salary premium of 5%, and it’s a good choice for those who want to become more proficient in coding.
If this sounds like an interesting fit for you, you can check out this article.
Two Figma AI Plugins That Are Almost…
Even though I write a newsletter where I enthusiastically champion AI tools, some of them just fill me with dread. Over the last year, I have watched the advance of UI-AI tools that allow us to simply describe an interface. At first they were clunky, and now they are getting really good, but I don’t think that’s a great leap forward. I have been predicting this for over two years and cautioning my students that entry-level UI jobs will be going away. This is partially due to the maturation of UX patterns and Jakob’s law (interfaces should look familiar), but also to the greater reliance on design systems to increase the speed of assembly. To be clear, I don’t think this is a great thing. Dragging components onto an artboard already feels pretty removed from being a designer. If you’d like to now type instructions and have the process automated so that you can free up time to do other things, that moment has arrived.
Edward Chechique has a great review of two Figma plugins, Musho and UX Pilot, that will actually speed up your workflow.
Google’s Highs and Lows
Last week I wrote some critical notes about Google’s new Gemini product…for lack of a better term. However, I did a little more reading, and perhaps I was a little hasty. While it was overshadowed by the launch of OpenAI’s Sora, Gemini 1.5 was released on the same day and is actually a huge leap forward in terms of capabilities. So if you like reading about things like tokens and context windows, you should check out this article. And if you’d like to understand how this fits into the entire Google ecosystem, you should check out this article. And if you’d like a grim reminder of how many people Google is laying off this month, go here.
Reddit Plans to Sell Its Data
While it hasn’t happened yet, I suspect that this will become a new paradigm and perhaps provide some sort of resolution for the lawsuits from the New York Times and others. The AI firms will always be hungry for data and are in a competitive race to train their Large Language Models. Media companies are justifiably outraged that they have been “scraped” without permission or compensation. If the Reddit deal goes through, expect this to become the new normal.
Why it matters: As human data continues to fuel AI’s rise, social media platforms will increasingly find themselves with a goldmine of data ripe for training new models. While Meta and X have their own AI ambitions, others (like Reddit) may be content to reap the sudden external value of their data.
Eleven Labs Will Soon Offer Sound Effects
Eleven Labs is one of my favorite AI tools, and I use it regularly for voiceovers in my AI film projects. They just announced that they will soon expand into making sound effects. If you’d like to get early access, fill out this form.
That’s it for today. I hope your three-day weekend was fun. I got lots of studying done and was super productive. As always, if you have comments, suggestions, or requests, send them my way, and have a fantastic day!