AI-Driven Storytelling in 3D Gaming

AI-driven storytelling is revolutionising 3D gaming with narratives that evolve based on your decisions, creating dynamic, personalised experiences that make each player's journey unique. As a passionate gamer, I find this exciting. Start playing AI-driven games today and discover stories shaped by your choices!

#Gaming #AI #3DGaming #AIStorytelling #AdaptiveNarratives #GameOn #FutureOfGaming #ImmersiveExperience #GamerLife #TechInGaming

Original Video: Matt Wolfe
Transcript
In the very near future, anybody is going to be able to create insanely good video games. And I'm not really talking about silly little video games where you have a person that can run around and jump and collect coins, either. I'm talking about realistic 3D, insanely good graphics, state-of-the-art games. Anybody can create games like this, and I'm going to prove it to you right now. So if you remember from past videos, I've talked about this tool called Blockade Labs, where you can create these skybox images that look like 3D, fully panoramic 360 images by just typing a prompt. So for example, if I want a sci-fi, cyberpunk, futuristic world on an alien planet, I can click generate and have a futuristic sci-fi world on an alien planet, just like that, that I can look around, and it looks amazing. However, this is just the beginning of what Blockade Labs has in mind for their technology. In fact, last month on Twitter they revealed that they're going to combine their skybox generation with sketch-based control, so that you can actually draw things in. Here's a demo video that they revealed where they're actually drawing on this area here. You can see it's a 3D world that they're looking around; they're drawing doors, they're drawing a ceiling on it. You can see down on the left this sort of panoramic view of what they're trying to draw here. And then they convert it, and you can see it's a 3D world with their actual drawings worked into the scene. I mean, just look at this scene that Ethan Mollick shared on Twitter of this awesome-looking control room. Blockade Labs even has an API where you can use these scenes inside of Unity and inside of Unreal Engine. Now, they're not in these tools by default, but if you request API access, you should be able to put these right inside of the two most popular game engines right now and use this tool as part of your game creation process. Pretty, pretty cool.
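Since the skybox generator is exposed as a web API, a tool could request scenes from prompts programmatically. The endpoint URL and field names below are my own illustrative assumptions, not Blockade Labs' real schema — check their API docs before using anything like this. A minimal sketch:

```python
import json


def build_skybox_request(prompt: str, style: str = "sci-fi") -> dict:
    """Build a JSON payload for a hypothetical text-to-skybox API call.

    The field names here are illustrative placeholders, not the real
    Blockade Labs schema.
    """
    return {
        "prompt": prompt,
        "style": style,
        "return_depth": True,  # a depth map helps when importing into Unity/Unreal
    }


payload = build_skybox_request("cyberpunk city on an alien planet")
print(json.dumps(payload))

# The payload would then be POSTed to the provider's endpoint, e.g. with
#   requests.post("https://api.example.com/skybox", json=payload, headers=auth)
# (endpoint shown is a placeholder).
```

The request-building step is kept separate from the network call so it can be reused from an editor plugin or a build script.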
Now, if you're not familiar with NeRFs, or Neural Radiance Fields, a NeRF is essentially when you find something in the real world and you take a whole bunch of pictures of it from a whole bunch of different angles, and then the AI essentially stitches it all together to make it into a 3D object. Now, that's a real oversimplification, but that's the best way I can describe it. There's a company called Luma Labs that's pioneering this technology, and you can kind of see on the screen here an idea of what it does. You take a whole bunch of pictures around the object, and then once you scan it in, that object becomes a 3D object. Now, NeRFs aren't just for scanning objects; they can also be used to scan locations, like this Reddit post here where this Reddit user used it to scan the Hill Garden and Pergola in Hampstead Heath, London. Using this technology, you can actually scan in entire environments. Now, what makes that really cool is you get stuff like my friend Coffee Vectors did here, where he actually scanned in a table in his house with actual objects on it, imported it into Unreal Engine, and then had the character run around inside of that scene, inside of Unreal Engine. So this is a real table with a real Rubik's Cube in his house that he scanned in. And the character that's running around there is the default character that you can use when you first play around with Unreal Engine. But what about the characters? So far, all we've seen is how to create really cool scenes, either from AI tools like Blockade Labs or using things like Neural Radiance Fields to scan in real-world environments. You can actually use NeRFs and scan yourself in; that's what Ian Curtis did here last year on Twitter. He actually scanned himself using the same NeRF technology and then created videos of himself as a character. You can see this is him actually walking around in a game world using his phone as a joystick. That's the character of himself that he scanned into the game.
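The "stitching" step glossed over above is volume rendering: the trained network predicts a density and a color at sample points along each camera ray, and the renderer alpha-composites them into a pixel. Here's a toy sketch of just that compositing step, in plain NumPy (a real NeRF learns the densities and colors with a neural network; these numbers are made up):

```python
import numpy as np


def composite_ray(densities, colors, deltas):
    """Alpha-composite samples along one ray, as in NeRF volume rendering.

    densities: (N,) non-negative volume density at each sample
    colors:    (N, 3) RGB predicted at each sample
    deltas:    (N,) distance between consecutive samples
    Returns the rendered RGB for this ray.
    """
    alphas = 1.0 - np.exp(-densities * deltas)     # opacity of each segment
    trans = np.cumprod(1.0 - alphas + 1e-10)       # transmittance after each sample
    trans = np.concatenate([[1.0], trans[:-1]])    # light actually reaching each sample
    weights = alphas * trans                       # each sample's contribution
    return (weights[:, None] * colors).sum(axis=0)


# One nearly opaque red sample sitting behind empty space renders red:
rgb = composite_ray(
    np.array([0.0, 50.0]),                      # empty, then dense
    np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0]]),
    np.array([1.0, 1.0]),
)
```

Everything the video shows — scanned tables, scanned people — comes down to optimizing the densities and colors so that renders like this match the input photos from every angle.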
Here's Ian's character standing as a giant in a parking lot. He used that same NeRF technology and was able to turn himself into a game character. In this Twitter thread here, he actually shows the process. You can see here he is getting scanned; he had to hold real still as somebody walked around him and scanned him into the NeRF. Here, after a little bit of cleanup, he had this 3D character that he was able to pose and model and do what he wanted with. He used the free Adobe tool Mixamo to rig up the arms and the legs so everything bends where it's supposed to bend, and he had a 3D character of himself that he can use in a game environment. But now I hear you saying, what if I don't want myself to be the game character? I mean, I think it would be pretty cool to be the game character, but what if I want something other than that? Well, here's some technology called Instruct-NeRF2NeRF. This is from a research paper from March of 2023, and it essentially allows you to take these NeRFs that you generated and give prompts to change them. So here's the original NeRF. They write "give him a mustache," and a mustache is on the NeRF. Now they type "as a bronze statue," and it converts this NeRF into a bronze statue. "Turn him into Albert Einstein": now the NeRF is Albert Einstein. "Turn him into a clown": now the NeRF is a clown. "Make him bald": now you have a bald character NeRF. "Give him a cowboy hat": now it's a NeRF with a cowboy hat. "Turn him into Batman": now you've got a Batman NeRF. "Turn his face into a skull": now you have a character that's a skull that you could run around as. You can see example characters down here. Here's the original NeRF that they scanned in of a person. They put the person in a suit, made him into a marble statue, turned him into a firefighter with a hat, turned him into a clown, turned him into a bronze statue. So you just scan one person, turn it into whatever you want to turn it into, rig it up, and you now have a custom game character.
Here's another NeRF environment where they're scanning this area with a tree in the center. And with this Instruct-NeRF2NeRF, they're able to turn that same scene into an autumn scene. They're able to turn it into a desert scene, turn it into a midnight scene, a scene with snow, a scene in the storm, or a scene in the sunset. So with this Instruct-NeRF2NeRF, not only can you change the appearance of a character that you scanned in with a NeRF, you can also change the appearance of an environment that you scanned in with a NeRF. Pull all these assets that you created, through tools like Blockade Labs or by scanning NeRFs, into a tool like Unreal Engine or Unity, and you can easily bring them together to create the environment of a game. But now I imagine you saying, but I don't know how to code; sure, I could bring the graphics into Unreal Engine or Unity, but then I don't know what to do with them. Well, guess what: both Unity and Unreal Engine are starting to build AI into their platforms. So right here you can see that Unity has an AI beta program: Unity is building an open and unique AI ecosystem that will put AI-powered game development tools in the hands of millions of creators. Soon they'll be able to do more, quickly create and deliver amazing real-time 3D content and experiences for billions of users around the world. Now, they have a little teaser video, which, quite honestly, is a little underwhelming. It doesn't show anything; it's just more teases. It says: create a 3D female character. Give me a large-scale terrain with a moody sky. Add a dozen NPCs. Make them alien mushrooms. Flying alien mushrooms. Slowly punch in with the camera. Add some lighting, add some dramatic lighting, add 2 seconds of thunder. Give me a synthwave look, give me a cyberpunk look. That's really the whole video. That's all they give us.
There's not actually any pictures of what this is going to look like yet, but they are building AI directly into Unity to make all of that stuff easier. Remember that tool I showed you a minute ago called Luma AI, where you can scan these NeRFs in? Well, they're one of the leading companies with this NeRF technology, and Unreal Engine just announced that they have a direct integration with Luma Labs now, so you can actually create NeRFs and easily bring them straight into Unreal Engine. And on top of that, I recently made a video showing you that you can use GPT-4 to write game code for you. So now you know how to make the background graphics, make the environments, make the characters, scan your own characters in, change what the characters look like, change what the environments look like, and use GPT-4 to help you create the code. And if all of that still feels way too complicated, I want to point you over to this other thread from my good friend Bilawal here, where he breaks down how Roblox is now adding AI into their environment as a full-blown game creator, where you can type what you want into Roblox and it will create games. In fact, remember that game that I showed you at the very beginning of this video? That shooter game? This game that you're watching right now was created inside of Roblox. Yes, Roblox, the game that kind of looks like a cross between Legos, Duplo blocks, and Minecraft. Yeah, that Roblox. This game was created with that tool. And check out some of the AI demos that have recently been released. You can see a car here on the screen. There's a prompt: brushed metal, diamond plate pattern. The car changes to a brushed metal diamond plate pattern inside of Roblox. Purple foil, crumpled pattern, reflective. Boom, there's your purple foil crumpled pattern. That's really hard to say. Red paint, reflective, glossy finish. There you go. So here's another video
that Bilawal shared from Roe Builder on YouTube, where he's actually creating wall patterns inside of Roblox using AI. Rainbow Brick Road. There we go. So if we just grab this, throw it on here, let it load. Look at that, guys. So he just typed into this little box here, Rainbow Brick Road, and it created these rainbow bricks on the wall here. This is so cool, dude. You can see he then tried a different pattern for that same prompt, Rainbow Brick Road, like this. Even that, man, let's let it load. Boom, you can see when it's like the base map, and then it's just like, boom, there's everything else that goes on with it, right? That's so, so cool. He's so excited about it, and I love it, because I'm so excited about it. And I've never even used Roblox before. In Bilawal's thread here, he says coding is hard, and coding for 3D is even harder. Conveniently, Roblox now has AI-based code assist: type natural prompts and have them automatically converted into Lua, the language of choice for Roblox. So check this out. Up at the top you see the prompt: toggle the headlights when the user presses H. Oh, that was coded based on this prompt; you can see it wrote the code here on the side. Blink the headlights when the user presses B: it wrote the code, and you can see the headlights blink. Blink the headlights every X seconds when the user presses X: it writes all the code for them. Now, that is just crazy. All of the coding is happening through just prompting inside of Roblox. Here's another example. Make it rain: you can see it starts raining. Make it float: the car starts floating. You'll even be able to generate 3D assets with AI, and search for existing 3D assets that were made with AI. You can see here they prompted brick house with smoke coming out of the chimney. It searched through a bunch of models and found brick houses with smoke coming out of the chimney.
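Under the hood, both "use GPT-4 to write game code" and prompt-driven code assist follow the same pattern: wrap the plain-English request in chat messages and send them to a language model. Roblox hasn't said what model powers its code assist, so this is just a sketch of the general pattern using the openai Python client; the system prompt and model name are my own choices, and any returned code would still need review:

```python
def build_codegen_messages(task: str) -> list:
    """Chat messages asking a model to write game code for a plain-English task."""
    return [
        {"role": "system",
         "content": "You are a Roblox developer. Reply with only runnable Lua code."},
        {"role": "user", "content": task},
    ]


messages = build_codegen_messages(
    "Toggle the car's headlights when the user presses H."
)

# With the openai package installed and an API key set, the call would look
# roughly like this (left commented out so the sketch runs offline):
#   from openai import OpenAI
#   reply = OpenAI().chat.completions.create(model="gpt-4", messages=messages)
#   lua_code = reply.choices[0].message.content
```

Keeping the message construction in its own function makes it easy to reuse the same scaffold for "make it rain", "make it float", and so on.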
Now, I'm not totally clear if those 3D assets were generated right in the moment or if they were pre-generated and it just found them. Either way, pretty cool. It's going to make game development very, very easy. All that to say, I truly believe anybody that wants to create a video game, regardless of having knowledge of game development, visual effects, 3D design, Blender, or any of the tools that used to be required to create really, really good-looking, polished games, AI is sort of making it so anybody can do any of that stuff now. Now, are we there yet? Not quite. I think there's a little bit of a ways to go with some of the NeRF technology, getting it a little bit more realistic, a little bit more polished. Some of the scans are still a little rough around the edges, and in a lot of the examples I showed you, they went back into Blender and cleaned them up a little bit, but I think that's only going to get better over time. Same with the 3D generated images from things like Blockade Labs; I think that stuff's going to get better and improve over time. But right now, with the tools that are available to us today, you can probably go and pretty easily create a really killer game. In fact, I want to jump back to Coffee Vectors here, with this little dude jumping around. I don't believe they had experience with game development; they watched a handful of YouTube tutorials and managed to create what you see here with a combination of scanning their NeRFs, importing them into Unreal Engine, and creating a little 2D side-scroller game in Unreal Engine. This was created by somebody who's a non-game-developer, and they were able to create this little character running around inside of this little mini world here on their table. How cool is that? Now, are you as nerdy as me? Do you love all of this future technology and all of these cool tools that are rolling out all the time? If you do, check out futuretools.io.
Futuretools.io is the site where I curate all of the coolest tools that I come across. I add new tools to it every single day, and I'm really only adding the best of the best tools that I come across. Now, I really think you'll dig a lot of them, but if there's just too many tools and you don't want to sort and sift through them to find just what you're looking for, join the free newsletter. Every single Friday, I'll send you the TL;DR of the week in AI: my five favorite tools, a handful of news articles, a handful of YouTube videos, and one cool way to make money with AI. That goes out every single Friday. You'll be joining almost 70,000 people who receive this newsletter every Friday to stay in the loop with AI. And if you like videos like this, and you like nerding out and want to see the latest technology and all of the latest AI, AR, VR, all of that cool, nerdy, futuristic stuff, give this video a thumbs up and maybe subscribe to this channel, because I'm going to keep making videos like this if you keep coming back to watch them. And I really appreciate you. Thank you so much for tuning in. I'll see you in the next one. Bye.