
Luca Mefisto

Software Engineer, Oculus VR

Luca Mefisto is a software engineer at Oculus VR and the director of MefistoFiles.

Description

In this episode, we talk about what it looks like to be an AR/VR engineer with Luca Mefisto, software engineer at Meta Reality Labs. Luca talks about being drawn to augmented reality and virtual reality early on, the main tools you need to know in order to be an effective AR or VR engineer, and some of the most interesting things he thinks people are currently doing with AR and VR.


Transcript


[00:00:05] SY: Welcome to the CodeNewbie Podcast where we talk to people on their coding journey in hopes of helping you on yours. I’m your host, Saron, and today we’re talking about what it looks like to be an AR/VR engineer with Luca Mefisto, Software Engineer at Meta Reality Labs.

[00:00:20] LM: It would be a big mistake for hospitals, for example, not to have it, because this is just cheap and powerful and it just works. I think that’s the ultimate power of VR.

[00:00:29] SY: Luca talks about being drawn to augmented reality and virtual reality early on, the main tools you need to know in order to be an effective AR or VR engineer, and some of the most interesting things he thinks people are currently doing with AR and VR after this.

[MUSIC BREAK]

[00:00:55] SY: Thanks so much for being here.

[00:00:56] LM: Oh, thanks for having me.

[00:00:58] SY: So Luca, you’ve been building things in the augmented reality and virtual reality space for many years, and now you are an AR/VR engineer at Meta Reality Labs, formerly known as Oculus, which makes some of the biggest, most popular VR headsets on the market. Tell us about how your coding journey first began.

[00:01:19] LM: I’ve been doing AR for around 10 years now, and VR as soon as it started. I think as soon as I finished my studies in Spain, I realized interaction was the type of thing I liked, like how humans interact with computers. That really clicked with me. At the time, AR was just barely starting. The new VR, let’s call it, was non-existent, but I started to make my little experiments, et cetera. At my first job, I was like, “Oh, I want to do this. I want to do some augmented reality and I want to do interactive stuff.” But Spain is a small country and doesn’t have a big industry for interactive stuff. So I moved to the UK right away and got a job doing augmented reality applications, a bit of augmented reality and a bit of everything. But after a few years, we had this big moment in VR in 2013 when Palmer Luckey introduced the Oculus Development Kit 1, and it blew the minds of many, many developers like me who were there in 2013 and tried this first prototype virtual reality headset. It didn’t have any positional tracking. The screen was really low resolution. It made you sick sometimes if you used it a lot. But for many of us, it was the moment of, “Okay, this is what I want to do. I think this is going to be a big industry. I think this is the moment I’ve been waiting for, for ages,” because I wanted to do interactive stuff, and augmented reality was falling a bit short in 2013. So I put all my eggs in that basket pretty much. We started doing virtual reality at the company I was working for, Hidden Creative, in Manchester. But in 2015, I was not working full-time in VR. So I decided to quit and start my own journey, building my own projects, building games, building experiences for clients, every time getting a bigger client. Many different industries were trying to jump into VR and they were looking for people like me. So I’ve been growing and growing and growing, trying a bit of everything from all the different industries, how to apply VR for them. And now I’m here, building Interaction SDK at Meta. So it’s a dream.

[00:03:31] SY: Take me back to 2013. As someone who had been interested in the interaction aspects of development for a while, as someone who did AR and VR before Palmer Luckey introduced the Oculus DK1, why was that moment in 2013 so important? What was the big deal for someone like you who knew what it was like before and is now working after? What was that moment like? Why was it so important?

[00:04:00] LM: I think the moment was very important for us more in a psychological aspect than a technical one. Because, for example, when some colleagues tried it, they thought it was shit, to be honest. “Oh, it makes me really sick. This resolution is terrible.” But some other engineers were like, “Wow! I just put this on and I really feel I am inside of this.” It was more this feeling of instant immersion, being able to see past all the problems that it had, to just stay there, stay calm, put the headset on and try to feel the moment. I was like, “Wow! I’ve never felt anything like this with technology.” This feeling, yeah, of presence, of immersion. So that was the big moment I think many people, many of us had. And for some others it didn’t make sense because the technology was not quite there at the time. Some people were comparing it to 3D screens, 3D TVs and stuff like that.

[00:04:52] SY: Right. Right. Yeah.

[00:04:52] LM: But some of us, we understood it’s not a new format to enjoy the same content. It’s not a 3D TV. It’s not a cinema screen. It’s a new medium where the rules we have learned about, maybe, theater or video games or cinema cannot really be applied. It’s not going to be a gimmick or a toy. It’s going to be something that really redefines everything else around it. And some of us, I think we saw that quickly and decided to just risk everything pretty much, because we could be unemployed right now, to be honest.

[00:05:29] SY: So before the Oculus DK1 came out in 2013, what was it like to build in VR? When you think about your toolset, the headsets that were available, just what it felt like to be an engineer trying to build out these experiences before the Oculus came out. Because the thing that’s interesting about VR is, like you said, it feels new, but it’s actually been around for a while. It just looked different. Right? It kind of felt and looked different and has evolved, but the promise of VR and the idea of VR has been around for a long time.

[00:06:02] LM: I think at that point VR was pretty much dead. No one was asking for it. There was no demand for it. At that time, augmented reality was all the rage, I think much more than it is right now. Around 2010, 2012, all the companies were asking for augmented reality stuff, even though at that time it was just for fun. But VR had a really poor reputation then. Like, we tested this in the ’90s. It didn’t work. Why would we ever try it again? We know it’s terrible. We know it doesn’t work. So no one really was asking for VR. But as someone interested in interaction, you had some things like, for example, Nintendo with the Wii motion controllers, or some contraptions to use video game consoles and stuff like that in a more human way. So that’s what we usually did. Like, I mean, maybe, I don’t know, play a game with Wiimotes on our PCs, so you had to do the gestures. Maybe put some voice recognition in other games so you could play them in a more immersive way. But we didn’t have any way to put a screen on your head. It was too expensive, too bad, to be honest.

[00:07:09] SY: So after Oculus came around and you got re-excited about VR and kind of the potential of it, what was the potential that you saw for VR? I think for most people it brings gaming to mind, right? And this idea of kind of having like fun experiences in a virtual space. But I think for a lot of people who aren’t as familiar, it does still feel kind of gimmicky. As someone who is really into the space, what was it that you were hoping to build with VR, with this new technology? What kinds of things did you see yourself creating for the world?

[00:07:44] LM: So at that time, in the company I was working with, we were doing a lot of training for people working in big energy industries, like maybe you have to build this offshore wind turbine and we’ll have to take you in a helicopter to the middle of the sea so you can climb the wind turbine and fix it. And we quickly understood that VR is never going to replace, well, I think not anytime soon, going there to fix it yourself. But many of these people were like, “Oh, you read this 200-page manual, we give you a test, then we will have to take you 10 times to the wind turbine so you can learn how to fix it.” And we understood that if they do it through VR, things get memorized much better, things get automated better in the brain. That is, if it’s done well, and that doesn’t mean realistic-looking, but done understanding the limitations of the technology and how the human brain works. If you do it in an intuitive way, you can take that person to that offshore turbine a hundred times, and then you can go there by helicopter just once and they will know what to do. This is one of the first things we saw, like, “Oh, this makes a lot of sense to use in training.” Much later in my life, I discovered the power of healing through virtual reality. I work with an NGO here in Granada and we use it to rehabilitate patients who have suffered a stroke.

[00:09:15] SY: Interesting.

[00:09:16] LM: It’s so realistic to them. And this is where I put pretty much all my work, into making interactions really intuitive so they don’t have to think about how to work with it, how to press a button or grab an object. They just do their exercises around a table, in a house, nothing fancy in 3D, but exercises they could very well do with the doctor. But now they can do them in VR. We can measure way more things, like the speed of their hand, how long it took them to look at this object. We can measure it from the headset and repeat it thousands of times. So it’s exactly the same exercises they do. But now, instead of one doctor seeing one patient, he can see 20, and they can take it home and do exactly the same. And not just us, but anyone else working on these sorts of projects, we have discovered that it works really well, to the point where we believe that as soon as we as an industry finalize our studies on health, it will be a big mistake for hospitals, for example, not to have it, because this is just cheap and powerful and it just works. I think that’s the ultimate power of VR.

[00:10:25] SY: Are there other industries you’ve looked at or heard of where VR can really make specifically a positive impact in terms of helping people and really making a difference in people’s lives beyond just kind of gaming and entertainment?

[00:10:41] LM: Yeah. One of the big selling points they are trying to push with VR is the social aspect. I get this question a lot: “Doesn’t it isolate you too much?” I think it’s exactly the opposite of being isolated. It’s more like a social experience. When you play a video game or you just talk like we are talking right now over a podcast on a microphone, we are having a conversation, but it doesn’t feel that social. Right? And I’m still doing it alone in my room. If we were doing it through virtual reality, I would feel your presence. We could feel each other’s presence, one next to the other. The social power of VR is one of the four big pillars of presence. There are some theories of what presence means, but they all agree that the social aspect of human life is one that VR supports really well. So being together with people while in your house is really powerful. And it’s also really powerful, for example, for remote work, where sometimes you are working remotely and you never see your team, and sometimes you just have a Zoom call with them and you just see them on the webcam. But having that, even if it’s just an avatar that doesn’t even look realistic, if they move and are portrayed in a believable way for your brain to understand, after 10, 15, 20 minutes, you don’t really think that’s not a human, that’s not a presence you have nearby. And that’s really, really powerful. I think it’s a properly good impact. It doesn’t mean you shouldn’t go out and meet other people, but the time you spend at your house talking with other people can feel like you were really there with them.

[00:12:21] SY: So let’s get a little more technical. What are the main tools that you need to know to be an effective AR and VR developer? What are the frameworks, languages, platforms? What are the things you need to know to build the kinds of things that you get to build?

[00:12:40] LM: For virtual reality, the main go-to solutions are game engines. They are still called game engines, but I don’t think that’s all they are anymore: Unreal and Unity. One uses C++. The other one uses C#. Probably 80% of virtual reality applications are built with Unity, so it’s definitely a good tool to have in your tool set. If you don’t fancy either Unity or Unreal, there are starting to be more and more virtual reality applications that run on top of the web. So you have things like Three.js, and on top of it something called A-Frame, which runs with JavaScript and supports virtual reality. They are still lacking some features and optimization. There are many big optimization problems in virtual reality, and that’s a problem for WebVR. So most of the WebVR experiences you see around feel subpar compared to the ones built with Unity or Unreal, but it’s definitely there, and I think it’s going to keep growing and growing and growing. That’s the idea of WebVR. So that’s for VR. For AR, apart from those, you can use Unity. Probably you can use Unreal as well, but I have barely ever seen an augmented reality application built in Unreal, so Unity is probably the clearer winner here. But you also have tools from Snapchat that don’t necessarily require code. You have tools from Instagram. You also have WebAR solutions that use JavaScript, too. So there are a lot more different solutions around.
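To make the web route concrete, here is a minimal sketch of a Three.js scene with VR support, written in TypeScript. It uses the modern WebXR API (the successor to the WebVR he mentions); treat it as a rough starting point under those assumptions, not a production setup.

```typescript
import * as THREE from 'three';
import { VRButton } from 'three/examples/jsm/webxr/VRButton.js';

// A tiny scene: one lit cube floating about a meter in front of the user.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 100);

const cube = new THREE.Mesh(
  new THREE.BoxGeometry(0.3, 0.3, 0.3),
  new THREE.MeshStandardMaterial({ color: 0x44aa88 })
);
cube.position.set(0, 1.5, -1); // roughly eye height, one meter ahead
scene.add(cube);
scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1));

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true; // turn on WebXR rendering
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer)); // adds an "Enter VR" button

// Once the user enters VR, the headset drives this loop at its native refresh rate.
renderer.setAnimationLoop(() => {
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
});
```

A-Frame wraps roughly this kind of setup in declarative HTML tags, which is why it is often the gentler entry point he alludes to.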

[00:14:12] SY: So if I want to start with AR and VR as a developer, as a software engineer, is there any kind of prerequisite that I should know, like math for example? Is getting into math a really good idea? Should I be a really good JavaScript developer first? You know what I mean? Like what are kind of the prerequisites, if there are any, before getting into some of the tools that you mentioned?

[00:14:37] LM: I think it’s at least getting easier and easier and easier. Tools like Spark AR and the Snapchat one don’t even require you to know how to code if you want to build something really simple. And in Unity, you have a lot of frameworks you can build on top of. For example, I’m building Interaction SDK, which is for interactions in VR, but you have AR Foundation from Unity to simplify creating augmented reality applications. So to get started, I think you don’t need a lot. To get really good, math is something… it’s true that while you are studying, many people say, “Oh, you’ll be doing math, and math is what you need. To be a good programmer, you need to be good at math.” That’s usually not necessarily true, and I know many programmers who are not great at math and barely ever need it. But I think if you want to do things in 3D space, at least the algebra from when you were 16 years old, that level of algebra, you have to know by heart. It’s not difficult math. The difficult thing is to develop the intuition for that math, because you’re going to have to solve a lot of very creative problems that happen in 3D space. If it’s not for an interaction, it’s for a graphics problem, like I want to do effects, I want to move around this space. Mid-level algebra is important to start developing that intuition, I think. Other than that, not much, to be honest. And I don’t think that’s even a requirement to just get started. But I think when people hit a roadblock after a few months of working and learning and want to speed up their development process, it’s usually because they lack that.
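As an illustration of the kind of 3D algebra intuition he means, here is a small, engine-agnostic TypeScript sketch that uses a dot product to ask "is the user roughly looking at this object?". The tiny vector helpers are written by hand just to keep the example self-contained; a real project would use its engine's math library.

```typescript
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const normalize = (a: Vec3): Vec3 => {
  const len = Math.sqrt(dot(a, a));
  return { x: a.x / len, y: a.y / len, z: a.z / len };
};

// Is the target within maxAngleDeg of the direction the head is facing?
// headPos and headForward would come from the headset pose each frame.
function isLookingAt(headPos: Vec3, headForward: Vec3, targetPos: Vec3, maxAngleDeg = 30): boolean {
  const toTarget = normalize(sub(targetPos, headPos));
  // The dot product of two unit vectors is the cosine of the angle between them.
  const cosAngle = dot(normalize(headForward), toTarget);
  return cosAngle >= Math.cos((maxAngleDeg * Math.PI) / 180);
}
```

That one idea, comparing directions with a dot product, already covers a surprising number of gaze, aiming, and visibility checks.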

[MUSIC BREAK]

[00:16:35] SY: What I find really fascinating about the VR space is it’s being invented, created, and then you have to learn it at the same time. It’s not like other fields, other industries where it’s been studied for a while and then kind of translated to books and courses. It feels like it’s kind of all happening at the same time. So how do you handle learning when you’re being introduced to technology as it’s being created and also have to learn it enough to use it? How do you think about your learning process and what works best for you?

[00:17:13] LM: That’s an amazingly good question. Because for me, when VR started, it was 2013. I had only been working for one year. I had a master’s degree and had just been working since 2012. And when I saw it coming, it was like, “Oh, wow!” Suddenly, this is a once-in-a-million opportunity, because everyone is suddenly put at the same level. I have the same level of experience in VR as this guy who is 40, and I’m 20-something. So I had to use it. Because, of course, if a new web framework comes out, everyone has to learn it from scratch, but the ones who already have a lot of experience with web frameworks can probably pick it up quicker. But since this was a whole new paradigm, it’s not just a technical skill in how you code, it’s also how you think about solving creative problems. Not just in the code space, but even in how do you make a video game for this, what language makes sense, can you make a movie when you cannot force the user to look in a particular direction. So you have to rethink even the language of theater, et cetera. So it really was a tabula rasa for everyone and we all started at the same time.

[00:18:20] SY: Right.

[00:18:21] LM: So for me, the way to learn was really meetups, I think. It was very community-driven at the beginning. It’s still quite community-driven, I think. There were not many of us doing VR at the beginning, so we all got together and knew each other. I started VR Manchester. Everyone who was starting in the industry got together there, and once a month we talked about our discoveries, what we’d learned, and stayed in contact over Slack channels and Discord channels with developers all around the world. I attended talks, but also prepared my own talks. There were no talks where anyone was an expert. It was more like, “I’m going to learn this so I can give a talk so we can all learn collectively.” So that was really the way of learning in virtual reality. Now I would say it’s a mixed bag.

[00:19:11] SY: Okay.

[00:19:11] LM: I think it’s still very important to attend community events. There is still a lot of innovation happening in this space. The best way to stay on top of it is to know who is building what and to check what they’re doing. But also, now, for example, Unity YouTubers and streamers are starting to put out a lot of content to at least get you started. So I think it’s important to follow these people to get started. Unity has a lot of free resources and courses on how to start understanding it. And once you have good control of it, I think the most important thing right now is to download experiences, games, and demos. Try everything. Be very critical of how they’re doing it, because it’s still a bit of the Wild West, and you cannot trust that every single game you are testing or playing is well done and follows the language or the good patterns, et cetera. So basically you try demos, theater, or whatever. Check the community, and at least Unity, Unreal, et cetera, now have more kickstart-style courses. But as soon as you reach a certain level, you just want to talk straight to the people who are building stuff.

[00:20:27] SY: So when I think about building in VR and AR and just hearing you kind of talk about it, it feels very similar to game development. Where did the two start to diverge?

[00:20:39] LM: Well, the language as an art, let’s say, is not necessarily the same in video games as it is in VR. There are many things you cannot do in virtual reality that you could do in video games, but that’s just the language part. In another sense, it’s also an industry. So yeah, you can do games and you can do VR games, and that’s becoming fairly profitable now. It was not at all five years ago. It was suicide to make a VR video game. But you have to understand now that you can apply VR to pretty much any industry. So all these soft skills of understanding your client’s needs, understanding that you’re making a product that is going to be deployed to a lot of, I don’t know, maybe workers or healthcare people or stroke survivors, et cetera. So understanding that you are not building a game, even though you’re using the tools you would usually use to build a game. Now you’re building more of a professional application. So it’s a bit of a mixed bag. One problem we usually see with many game devs, and I’m a game dev myself, is that most games are built in a way where it just has to work. It’s all cheating your way to finishing the video game, just put a plaster on it and ship it. But these people are now building products, so they have to think way more about scalability, about refactoring, about understanding that the software now has to keep growing and follow more agile methodologies, et cetera, so it can grow with the companies. If they’re making just a VR game, it’s pretty much the same. It’s just the language that changes.

[00:22:24] SY: So let’s get into the work that you are doing on Oculus right now. LinkedIn says that you are developing an interaction SDK. What is that all about?

[00:22:34] LM: Yeah, Interaction SDK is a set of tools mainly focused on hand tracking, which is the technology that lets you use your naked hands, without controllers, in virtual reality. Up to this point, Oculus was literally offering developers just the raw values of the hand, basically, like, “Oh, yes, I see your hands. Your index finger has the distal joint at these degrees and the proximal joint at these degrees. Off you go.” Right? But if you have this massive amount of raw data every frame, what does it mean to grab an object, to poke a button? What does it mean to point at something or to do a gesture? Right? That’s what we are trying to solve. We try to give people a library where we say, “Oh, do you want to do gesture detection?” To understand that the user is, I don’t know, pointing at something, doing a thumbs-up sign or doing a T-pose, like asking for a time-out. That, we are building for them. If you try to close your hand around a virtual object that doesn’t really exist, how do you code it in a way that the hand looks like it’s realistically grabbing that object and the object becomes attached to the hand? Because, again, your hand exists, the object doesn’t, et cetera. That also usually requires a lot of code. So we are solving that as well. Even things like how you move around the world with just your hands. It also works with controllers, but we’re trying to put the focus on hand tracking because it’s the difficult problem here. Even just something as simple as pressing a button with your fingertip, right? If the button doesn’t exist, how do we make it feel great that you are pressing something? If you don’t even have the haptics, the vibration of the controller, you have nothing. It’s just your naked hand. How do I make you, the user, feel that that was a physical button that really existed and that you really touched it?
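To give a feel for what "turning raw joint data into a gesture" involves, here is a deliberately simplified TypeScript sketch of a pinch detector. The data shape and thresholds are invented for illustration; this is not the Interaction SDK API.

```typescript
type Vec3 = { x: number; y: number; z: number };

// Hypothetical per-frame hand data; real runtimes expose a much richer joint set.
interface HandPose {
  thumbTip: Vec3;
  indexTip: Vec3;
  isTracked: boolean; // false while the cameras lose sight of the hand
}

const distance = (a: Vec3, b: Vec3): number =>
  Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);

// A pinch "gesture" is just the thumb tip and index tip getting close together.
// Two thresholds (hysteresis) keep the result from flickering on noisy tracking.
const PINCH_ON = 0.02;  // meters: start pinching below 2 cm
const PINCH_OFF = 0.04; // meters: stop pinching above 4 cm

function updatePinch(hand: HandPose, wasPinching: boolean): boolean {
  if (!hand.isTracked) return wasPinching; // hold the last state while occluded
  const gap = distance(hand.thumbTip, hand.indexTip);
  return wasPinching ? gap < PINCH_OFF : gap < PINCH_ON;
}
```

Every gesture in the catalogue (thumbs-up, T-pose, poke) is ultimately some rule like this over joint positions, plus a lot of filtering to cope with the tracking noise he describes next.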

[00:24:30] SY: What is it about hand tracking that makes it such a difficult and interesting problem to solve?

[00:24:37] LM: Yeah, the main problem is that for hand tracking on Meta Quest, for example, the headset has four cameras. And from these four cameras, it can tell your position in a room, but it also uses the cameras to detect where your hands are. So it uses machine learning to differentiate your hands from the background, and it returns what we call a hand skeleton, which is the position of each one of the bones of each one of your hands. The problem is it’s all computer vision-based machine learning. So if you move your hands really fast, the algorithm may fail. If you block the view of one hand with the other hand, you might not know exactly where the fingers are. And if you put your hand behind your head where the cameras can’t see it, you also don’t know what the hand is doing. So it’s a lot of data per hand being received every frame that is also really, really noisy, because it’s all based on computer vision. On top of that, your hand exists and you can feel your hand, and you know it has a weight and you feel when you’re touching things. But in virtual reality, if I draw a bottle in front of you that doesn’t exist, you won’t necessarily feel that you’re touching it because, of course, the bottle is not there. And if I went for the naive implementation, let’s say I take your data and I make the hand in virtual space be like a physical hand that can collide with things. Now you try to put your hand near the bottle. What do you need to do to grab it? You have to stop your hand in time, because if you push too far, you will push the bottle away and it will fall. That’s already a problem. If you stop too soon and start closing your hand, you will push the bottle again with your fingertips. You have to stop your hand exactly where the bottle is and carefully close each finger so it grabs the bottle nicely. And that’s a terrible user experience, isn’t it? So it’s such a difficult problem to make you feel like your hand has a presence, that your hand is really there. But it can be solved. It can be solved through great design, trying to adapt to what the user’s intention is, trying to use good sound as well. If you move your hand fast near a bottle and close it, you should just grab it. Right? Without solving this sort of problem, it would be very difficult to use hand tracking. There are many problems. One that I love, the most simple one, is how people point. How do you point at things? Because if I’m giving you the data of your hand, many people will say, “Oh, it’s just the index finger. I’m extending my index finger. I’m pointing at something far away.” Whenever I ask this question to students, the most naive implementation they give is to just take the direction of the index finger and say that’s where they’re pointing, and that is incredibly wrong. One specific thing is that if you have even a one-degree angle error, after a 20-meter distance you are meters and meters off. The way Oculus suggests doing pointing, for example, is you take the position of the fingertip. You don’t know where the shoulder is, but you can estimate it: from the wrist position you estimate the elbow position, you have the head position, and from those you estimate the shoulder position, and then you trace a ray from that shoulder through the fingertip and project it ahead. And that is actually much closer to what the user is pointing at than just following the finger.
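The grab problem he describes, where a "physical" virtual hand just knocks the bottle away, is usually sidestepped with an intent heuristic rather than pure physics. Here is a rough TypeScript sketch of the idea he hints at ("move your hand fast near a bottle and close it, you should just grab it"); the fields and thresholds are illustrative guesses, not anything Oculus ships.

```typescript
type Vec3 = { x: number; y: number; z: number };

// Hypothetical per-frame summary of one hand relative to one grabbable object.
interface GrabCandidate {
  handPos: Vec3;          // palm position this frame
  handSpeed: number;      // meters per second, estimated from recent frames
  handClosedness: number; // 0 = fully open, 1 = fist, derived from finger joints
  objectPos: Vec3;
}

const dist = (a: Vec3, b: Vec3): number =>
  Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);

// Intent-based grabbing: instead of waiting for perfect physical contact,
// snap the object to the hand as soon as the motion clearly reads as a grab.
function shouldSnapGrab(c: GrabCandidate): boolean {
  const nearObject = dist(c.handPos, c.objectPos) < 0.12; // within ~12 cm
  const handClosing = c.handClosedness > 0.6;             // mostly closed
  // A fast approach or a very deliberate fist both count as clear intent.
  const clearIntent = c.handSpeed > 0.5 || c.handClosedness > 0.85;
  return nearObject && handClosing && clearIntent;
}
```

Once something like `shouldSnapGrab` fires, the object is typically attached to the hand and the rendered hand is posed around it, which is the "hand looks like it's realistically grabbing" part he mentions.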

[00:28:11] SY: Interesting. Yeah.

[00:28:11] LM: So even something so simple, something that when you think about it seems like a trivial problem, becomes a massive problem. It’s even trickier because when you’re pointing up, you estimate from the pelvis of the user and not the shoulder. So something as simple as that can make or break an app.
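A hedged sketch of that shoulder-anchored pointing idea, again in TypeScript with invented offsets (the real heuristics, including the shift toward the pelvis when pointing up, are more involved than this):

```typescript
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });
const normalize = (a: Vec3): Vec3 => {
  const len = Math.hypot(a.x, a.y, a.z);
  return { x: a.x / len, y: a.y / len, z: a.z / len };
};

interface PointingRay { origin: Vec3; direction: Vec3 }

// Naive version: just extend the index finger. A few degrees of error in the
// finger direction is already a meter or more off at a 20-meter distance.
function fingerRay(indexTip: Vec3, indexDirection: Vec3): PointingRay {
  return { origin: indexTip, direction: normalize(indexDirection) };
}

// Body-anchored version: guess a shoulder position from the head pose, then
// aim from that guess through the fingertip. The offset here is made up; a
// real system refines it from the wrist and elbow and blends toward a lower
// anchor (near the pelvis) as the arm points upward.
function shoulderRay(headPos: Vec3, shoulderOffsetFromHead: Vec3, indexTip: Vec3): PointingRay {
  const shoulderGuess = add(headPos, shoulderOffsetFromHead); // e.g. { x: 0.17, y: -0.15, z: 0 }
  return { origin: indexTip, direction: normalize(sub(indexTip, shoulderGuess)) };
}
```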

[00:28:36] SY: Coming up next, Luca talks about the different pros and cons of working in a small company versus working at one of the bigger, well-known commercial companies currently leading the AR and VR space after this.

[MUSIC BREAK]

[00:28:59] SY: So you’ve worked for both big and small companies, as well as for yourself in the AR and VR space. What are some of the biggest differences in working in these different capacities and different environments?

[00:29:12] LM: I think with big companies, especially where I’m working right now, what I’m really liking is they care about having things done right, not necessarily fast. They just focus on finally nailing it, finding the proper solution. Working in smaller companies, especially in VR, there’s this sense of urgency, because most of the clients you would have up to this point were usually not very big companies. It was companies that were using VR, I don’t know, for a new use case or something where they had to demonstrate that it works really quickly. So it’s quite stressful sometimes, because you need to get things done very quickly, but everything is three times more work than a normal application. No one knows anything. Even the client itself is figuring out what it needs as you are building it, because they have never tested it. Everything is so new that they don’t even know exactly what VR can do for them. So you are building a prototype at the same time as you are building the product, and everything is more difficult because no one has solved many of the problems that you have. So it’s a bit stressful, but the good thing is no one has built it before. So there’s a massive amount of creativity that you can put into your product and your project. You can innovate, and every single project feels incredibly fresh and new. At my last job, when I decided to finally quit, I was making some VR, I was making AR, but I was also making Android, iOS, and Windows Forms apps, and it got to the point where I felt I was building the same thing over and over. I’m making another panel that comes in from the left, another hotspot that you need to press, and it felt repetitive. But with VR, you don’t have that. With VR, every single project, especially for a small client, is brand new and very creative and really motivating. I really like that.

[00:31:02] SY: So when I think about VR, it feels like such a big investment, right? The technology’s new. There’s the hardware component, the software component. It feels like if you want to do VR or AR and you want to really make something productive, you kind of have to be at a big company like Meta or Microsoft or HTC Vive, like it feels like you need to be in that kind of environment to really do this kind of work. And I wanted to know: Is that true, in your opinion, having worked for both small companies and big companies? And if you don’t work for one of these big entities, what are some of the biggest challenges in building in this space without necessarily the resources or scale of a company like Meta?

[00:31:50] LM: I have to say I’m very happy working at Meta, but I was never unhappy on the other side either.

[00:31:56] SY: Okay.

[00:31:56] LM: I never had a bad year, I’d say, or a terrible year. Every year has just been more and more. And now I’m working at Meta, of course, so I’m happy about that, but I was happy with all my previous jobs as well. The thing with VR is it affects pretty much all industries. Maybe if you’re making games, which is what everyone is doing, you will struggle a bit. But if you understand that the power of virtual reality is that it can be applied to pretty much any industry, you can find a lot of opportunities in anything. Again, I was making VR concerts for some music artists. I’m doing health. I’ve done big industries. I’ve also done theater. All of them want to do VR. They are the ones putting in the investment and they are looking for solutions. So right now, if you know where to look, you will find many places. There’s actually a big lack of professionals. We are lacking professionals left and right. We need more programmers. We need more designers. We need more 3D artists who understand the space, constantly. And I’m not talking about Meta. I’m talking about the whole industry. There’s too much work to do because every single industry wants to see how VR can solve their problems. And the good thing is VR can solve many of their problems. A few years ago, that was not the case. But now I think we’re in this sweet spot where you can find the opportunities, but there’s not a lot of competition. So you can find your spot, I think. And right now, I have the Meta thing, but I’m also working on this health solution too, and that one is working really well. And I see that pretty much all the small companies doing stuff like that always find their place.

[00:33:37] SY: What are the biggest challenges when you’re building things within one of these huge AR/VR companies?

[00:33:45] LM: I guess the biggest challenge is that there are many, many teams working on very different sections, and sometimes these sections overlap. So when you want to work on something, first you need to check if another team is working on it. And if they’re working on it, you have to start conversations to see if you like your solution more or their solution more, and then you have to have these debates, these discussions, to say, “Oh, no, let’s do it my way,” or, “Let’s do it your way.” So I think the biggest problem, if that’s a problem, is you spend a lot of time communicating your ideas and sometimes trying to convince people to go with your idea, or sometimes, annoyingly, finding that you have worked on something that another three teams have already solved. Oculus is such a big company, I mean, Meta Reality Labs is such a big company as well, that, yeah, sometimes you find a bit of, “Oh, I was working on this, but this is already solved.” But at the same time, sometimes no one has solved the problem already and you have to kind of take responsibility. It’s a pro and a con. Everything takes much longer. Everything has to be cross-validated by a group of engineers. But for me, having worked on my own since 2015, and I was doing well and I liked it, I was missing these conversations with a lot of good engineers about how to do things right. And now I have them every day. That’s really cool. Just some days you just want to code.

[00:35:18] SY: What are some of the coolest and most interesting things that people are doing in AR and VR today that you’ve seen? Maybe within Meta, but also just generally in the space, things that you’ve seen other people built.

[00:35:31] LM: There’s a big application, very famous, called VRChat. It’s just like a Wild West social network. VRChat itself is well and good, but the cool thing is I’m seeing, for example, theater groups doing live theater in there because they couldn’t go to a proper theater with real people. They do it now in VRChat, and they can make theater with you and involve you in the story. It’s really amazing because you can be part of the story they are recreating live for you, and I think that’s really, really cool. Of the stuff I’ve seen, I’ve been building concerts, and I think that was a very cool space during the lockdown. For big artists, this is not usually a big problem, but the people who work with these big artists, they don’t have a job. The lighting engineer, what is he going to do? He has to go put up lights at festivals, and there are no festivals and there are no concerts. So in VR, we have now seen many big artists do their gigs, and the tools provided in VR are tools that can be used by the lighting engineers or the creative team that would build a massive stage. Now they build it in VR. They have a lighting engineer with the mixing table they use for moving lights or lighting volumes, et cetera, and they use that in VR. So all these people don’t lose their jobs. They still work pretty much exactly the same way with exactly the same tools. They’re just not doing a real gig. They’re doing it in virtual reality. So I think that’s another very cool one. And I guess another cool one, which I’m really in love with, is health.

[00:37:11] SY: Absolutely. Now at the end of every episode, we ask our guests to fill in the blanks of some very important questions. Luca, are you ready to fill in the blanks?

[00:37:23] LM: Yeah.

[00:37:24] SY: Number one, worst advice I’ve ever received is?

[00:37:28] LM: Leave optimization for the end of the project.

[00:37:30] SY: Oh, interesting. Tell me about that. Yeah, that’s a pretty common one, I feel like. Tell me more about that. Yeah.

[00:37:35] LM: It’s a very common one, and I know it makes a lot of sense in other spaces, but in VR, you have to run at, let’s say, 90 frames per second. If you open any normal video game, for example, let’s say a 2D game on your phone, it runs at 24 frames per second. No one cares. So you finish building the game. It runs at 24 frames per second. It’s done. Good. You don’t need to optimize that. In VR, you have the same chip as that phone, but now you’re doing 3D with two cameras. So it’s twice the power that you need, and you need to run it at 90 frames per second. And if you drop to 89, it’s a no-go. You cannot release the game. You can never drop a single frame. So suddenly, if you leave optimization for the end, you will discover that you don’t just have to refactor code. Sometimes you have to redesign and rethink entire sections of your game. It’s so important that games, and experiences in general, run well and look believable and never break immersion. And breaking immersion means, yeah, having those kinds of problems. I have seen projects where optimization has easily been 90% of the work because they left it for the end, and they even had to completely change entire things just because it’s a very, very hard limit.
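To put rough numbers on that (approximate, just for illustration): at 24 frames per second a game has about 1000 / 24 ≈ 42 milliseconds to produce each frame, while at 90 frames per second the whole budget is 1000 / 90 ≈ 11 milliseconds, and within that the scene has to be rendered twice, once per eye, so each rendered view gets roughly 5 to 6 milliseconds on comparable mobile hardware.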

[00:38:58] SY: Interesting. Number two, best advice I’ve ever received is?

[00:39:03] LM: Get involved with the community. Manchester is a very community-driven city. It’s not a big city. I think it’s 400,000 people. But some very cool things have been built there, like WordPress or the Raspberry Pi. Every single day, there would be two or three meetups, for Android developers, JavaScript people, Scala fans, I don’t know. I started by going to the Android meetup because I was doing an Android app. And I met people. I learned a lot of very specific things and very obscure things I love. I ended up giving my own talks, which forced me to learn as well. And it also makes your life easier, not just to stay in the know, but even to land a job later. So it just makes your life easier, and there’s no reason not to do it.

[00:39:52] SY: Number three, my first coding project was about?

[00:39:55] LM: It was a script combining a Wiimote and its motion sensing with a Dance Dance Revolution dance mat and voice recognition to play games. I had to walk on the spot pressing the buttons and even jump or crouch, and I had to hit people with the remote, everything connected to the PC. And in Skyrim, they have this thing where you can shout spells, so you also had to shout at the computer to do the spells. This was my first small project, just for myself, and I really loved it.

[00:40:25] SY: Number four, one thing I wish I knew when I first started to code is?

[00:40:29] LM: That I should have paid way more attention in algebra. When I studied it at uni, I was like, “Oh, this is not for me.” It was one of my worst subjects. I did poorly. I didn’t care too much about it. I didn’t know I would learn to love it. And now, every now and then, I regret it and even think, “Should I just go back to the university to take that class again, just to see if I can finally nail it?”

[00:41:01] SY: Yeah. Well, thank you so much again for joining us, Luca.

[00:41:05] LM: Thank you so much. It’s been a pleasure.

[00:41:14] SY: This show is produced and mixed by Levi Sharpe. You can reach out to us on Twitter at CodeNewbies or send me an email, hello@codenewbie.org. For more info on the podcast, check out www.codenewbie.org/podcast. Thanks for listening. See you next week.

Thank you to these sponsors for supporting the show!
