The Medical Futurist | March 16, 2019

Interview with Robert Scoble, Tech Blogger and VR Expert

Imagine that a doctor sits down in a Starbucks, puts on a pair of glasses, instantly invokes five screens and starts doing his diagnostic work. Robert Scoble, virtual reality expert and tech evangelist, believes that will be possible in the coming years – sooner than we might think. He told us why his wife can’t stand immersive environments, when the first VR hardware and software offering a real consumer experience will come out and when we will all know that a new world has arrived.

The first issue we face when we talk to researchers and clinicians actively working on bringing VR into clinical settings is that patients and their physicians find it hard to work with VR hardware and software. They get dizzy after using the rather bulky devices. Researchers have also seen that if they just hand the goggles to patients, nothing changes.

But if clinicians become coaches, it even improves the collaboration between them and their patients, and that leads to reductions in pain, anxiety, phobias, PTSD and many other psychological problems. So, how do you see the evolution of VR hardware and software? What is the best we can expect these devices to achieve?

I think change will start coming very quickly. Oh, God. I have spent the last year thinking about how the next 5 years will look. I’ve met companies like Magic Leap and people working on the Microsoft HoloLens and at Facebook, and I’m talking a lot to developers. I track over 4,000 people on Twitter, and I see where investors are moving money. I’m already hearing about the next HoloLens. Facebook has set 2020 as their internal date for their goggles; they’re spending a lot of money building a new operating system and a new brand, developing Oculus, building relationships with developers and quietly acquiring small companies. That’s good insight for starters, but it only takes us 2-3 years ahead. That’s when I think a fascinating device will show up that fixes a lot of the problems you’re talking about.

I know those problems, too. I have a HoloLens, and I can’t use it for very long for a couple of reasons: it’s too heavy and too expensive for a consumer – though for a doctor, 3,000 dollars might be okay – and the optics just aren’t good enough. If I virtualize a screen and start reading Twitter inside VR or inside the HoloLens, I just can’t read it; it’s not sharp enough. According to developers, it’s probably 2-3 years before the optics get sharp enough for that. Those are real constraints, so I don’t think we’re going to see an actual consumer device until 2020 or possibly later, 2022. That doesn’t mean you can’t start playing today – you can still start your business now, it just means you have to put up with the problems of the HoloLens or cut your dream down to just using smart glasses. For example, the glasses from a company called North, which used to be Thalmic in Canada, look classy. The company, which received 140 million dollars in funding, developed just a pair of really lightweight glasses with a smart screen on one eye. It’s like a Google Glass done right.

That’s a stepping stone, but that doesn’t mean VR is not exciting. We see devices like the Oculus Quest – a 400-dollar, self-contained, six-degree-of-freedom device. That means you can move around and play football with someone, throw a snowball, or walk around a 3D image or a screen. It means you can put a patient into VR without being a nerd, without being tethered to a machine and without spending too much. The average doctor could afford two of them, so you can look at an MRI scan or help a patient manage pain. The University of Washington is doing pain research with VR, and they found that putting a burn victim into a snowfield game where he can throw snowballs is more effective against some kinds of pain than morphine is.

Another potential use would be the treatment of PTSD or training surgeons for new types of surgeries. You will probably not care about having the most polygons – you’re not building a high-end video game, you’re just trying to walk through the heart. That’s going to be very possible in VR. On the other hand, even the 200-dollar devices already let you watch a 360-degree video, and that actually might have some impact on getting kids over the pain of shots or just taking away the dismal nature of a hospital. As a patient, if you’re lying in a hospital bed with goggles on and feel like you are in Yosemite rather than in a hospital room, it helps you deal with the depression of being stuck in bed for a while.

I believe this year is really the year for joining this new industry, starting to play with it and understanding how to use it. Then, the next two years will be a time of innovation, with people building things for the medical industry, and later, when the right glasses come, you’ll be ready for them. Does that make sense?

Absolutely. If you don’t mind me poking you about your futuristic visions – what are your highest expectations for the size of these devices and the experience they can provide? What’s at the end of the tunnel?

What year do you mean? In 2019, we’re going to be working with devices like the Magic Leap. It’s fairly wearable but a little dorky, and it’s not for everything: the viewing angle is limited, you can’t bring a virtual item really close to you because the device can’t render it, and the optics aren’t there yet. However, it’s fascinating on a whole other level. It really shows you this new paradigm that is coming at us, and it helps you dream a little bit about how healthcare will be done in 2022 or 2025. It lets you innovate and change the culture slowly, start evangelizing it internally and showing people, “hey, here’s something small you can do today with this.”

I mean, my dream is a pair of glasses that feels like the ones I’m currently wearing: fairly lightweight, not nerdy, with a trustworthy brand, giving me a view onto the real world that I don’t have today. It lets me play Pokémon Go and work on structures in 3D, no matter whether I’m a car designer wanting to see my car on the floor or a doctor wishing to look up patient data. The data remains very private – nobody can see what’s on my glasses. I can visualize data coming out of a new machine, whether it’s an MRI image or a pre-visualization of a surgery. And that world is happening, it’s just a little way off.

You’ve mentioned different VR, AR and MR devices and you call them immersive reality or XR.

It’s a spectrum of devices. Even within VR, some headsets only have three degrees of freedom, which means you can look around, but you can’t really move. But all the games require you to move, and for that, you need six degrees of freedom. Now your cost just went from 200 to 400 dollars. Those devices are coming next spring from HTC and Oculus. I’m really keen on the Oculus Quest.

On the other hand, some people call it spatial computing. I like that term, too, but it’s all so confusing. People are just starting to hear about VR, but there’s already AR, MR and XR out there. I call it immersive because it puts you into the computer. What we’re really building is a better relationship between you and the computer. I remember when I started out with computers: to copy a file, I had to type a command into DOS. Then the Macintosh introduced the idea that you could click on a file and drag it from one folder to another. In VR, you just grab it and move it, like in the real world, and the same goes for AR, which puts you into a computing environment with all the surfaces around you.

That’s a very interesting way to put it, because many studies draw a hard line between VR and AR, or AR and MR, based on how they interact with the environment and how much the user can see. So you’re actually saying that this will all become one big group of spatial computing or immersive reality devices?

I could see that 10 years from now, we’re going to have a pair of glasses that can turn black and do VR, AR or XR, whatever you call it. People are sensing that that’s where we’re going, so I’m not sure how much sense all these definitions make – though there’s certainly a difference between a HoloLens and an Oculus Quest.

But you’re going to see some of the features of the HoloLens on the Oculus Quest, and some of the features of the Oculus Quest on the HoloLens, too. So it’s hard to make a real definition, other than that with the HoloLens you see the real world, while with the Oculus Quest you’re in a black box, unable to see your environment. However, there’s a camera on it, so you might see the real world through that. Also, you can do AR with an Oculus Quest, meaning you can overlay information on the real world – how else would you see people playing virtual tennis? That’s why it’s so hard to have strict definitions.

I put them all together because as a group they’re playing in the same league: every single one of them places you in the middle of the computer. I almost wish we had called it ubiquitous computing or something like that. But I guess the name also keeps shifting because the marketing teams come up with a new name every time a new device comes out.

Marketing guys can come up with amazing videos. I saw one about dissecting a human body through the HoloLens in MR – I use that app myself, and it was a sublime technological experience. And I saw another video on Virtuality about the HoloLens showing how a future medical visit at home would take place. It shows a patient wearing a HoloLens and a doctor sitting in front of him.

The physician talks to him while the relevant data appears on the wall. But all I could think about was that the doctor was sitting in a studio looking into a camera – unlike the patient, he wasn’t in the experience. Do you think there can be a change there? Could they both have the immersive feeling?

I think you’re going to start seeing that shift. If a kid at Carnegie Mellon Medical University wants to buy two HoloLenses for research at 2,500-3,000 dollars each, that’s hard to justify. But these Oculus Quests are 400 dollars, so I can see a student buying two of those and starting to build a system for working together in these glasses. I mean, that’s what Oculus is showing: how people can play tennis against each other, and so on. If you can play virtual tennis, you can do surgery training, or you can do a game with a patient where you’re both in the same game. You can both see the same thing, and you can talk to each other. And that’s going to lead to where we don’t just do a technology demo, but do something real.

And that gets real people to see the magic – because really, the cost is about interactivity. Six degrees of freedom is a lot more expensive, but it’s also much better. If I’ve got a 3D scan and I want to visualize your heart with you, I want to put you in a headset, then get in a headset myself to see what you’re seeing. Then, I want you to look at what I’m looking at. What’s more, I want to spin the heart around and zoom into it – all of a sudden, we’re changing the patient-doctor relationship. Your patient will understand the problem in a lot more detail. And that’s coming by the end of next year.

The team building this solution will have the skills to take it into the glasses when the right glasses come – and that’s really when everything changes. When the doctor is walking around with a pair of Apple glasses and the patient walks in wearing the same, and they can instantly start up an app and look at something together – that’s when you’re going to know that the entire world has changed. The future starts there. But this world is 3-5 years away, maybe even a little more for many people.

What challenges do you see when people start using the devices we have today? What could be the obstacles to adoption?

I can list a couple: weight, cost, the nerdy factor, a narrow viewing angle, images that aren’t sharp enough, or the fact that you’re in a black box. If you want to set up an Oculus today, you have to buy a PC with a video card, set up two sensors, and then put on a black headset tethered to the PC. It’s expensive and cumbersome – it’s really limiting. Only certain doctors would even try it – those willing to put up with a lot of nerdy pain.

But a lot of those problems are going away this next year with the Oculus Quest, because there’s no tether, there’s no computer, and it’s 400 dollars. You’re still in a black box, but you’re not tethered, you can walk around a room with it and start doing things. That’s really exciting. The black box is still going to keep certain people away – for example, my wife can’t use VR; she feels ill in it, and she doesn’t like to be in a black box. Her problem is not going to be solved until the right glasses come along. Well, I’m going to buy the next HoloLens, and we’ll see how she deals with that.

Also, there are not enough apps. So even if we could afford it, even if the right glasses came out tonight, our software teams are not ready to build for them; they just haven’t had time to learn the techniques and secure the funding. And that’s going to be a massive limitation for enterprises. It’s not going to be useful in the medical context until custom software is written for it, and that’s going to take a few years. I think that’s why the Oculus Quest is so essential. As an innovation team, you start with a 400-dollar device that lets you sell an idea to the bosses and say, “Hey, can we get a million dollars to hire a couple of programmers to start thinking through this stuff and start, you know, getting ready for the next world?” And that’s how things will go in enterprises. It’ll be slow, but 10 years from now, I can see it all coming together.

How do you think immersive reality devices will transform our daily routine and everyday life? What is your personal vision of how these new tools will become embedded in our lives?

The first thing is that you get many virtual screens around you when you’re in an Oculus, whether in VR or AR. And that’s a huge change, because you don’t need to sit in an office anymore – you could be at a Starbucks with 5 displays in front of you. Plus, you’re going to be able to see those screens on a plane, on a subway or while talking to a patient. So, you start thinking about virtualizing screens. The doctor will be able to see your patient record privately on a virtualized screen in front of him. That really does change a lot – just that simple step.

You don’t really need to focus on the weird mixed reality part of it, where monsters come out of the walls or something. Just getting virtualized screens is going to reduce costs in a medical setting and improve patient care, because you’re going to have a doctor with a lot of information right in front of him all the time. If he meets you in a cafeteria, he can talk to you because he can see all of your charts right in front of him. And then you start seeing all the magic.

For example, at Stanford, the VR lab did research with a sports team, and they found that a quarterback does a better job when he watches his plays over and over in VR, because his brain starts seeing things that he didn’t see even on the playing field. It’s the same thing with a surgeon. If you practice a surgery a hundred times in VR, then when you do it for real, you’ll be much more ready for a complex procedure than you would be if you had trained by watching videos or trying it in a classroom.

When you start thinking about how to work and interact in 3D, that’s when things really begin to change, because you’ll demand that your machines send that 3D image into your glasses. Why do you see an ultrasound only as a flat image? How come it doesn’t produce a 3D image? Why doesn’t the MRI make a 3D image? Why can’t I see your heart rate in 3D? What if I could walk around it and maybe see a pattern that I can’t see today? And on and on.

And it’s not just visualization that’s happening. I mean, blockchain is bringing innovations to medical records. Artificial intelligence is transforming things and elevating the quality of care. An Israeli company called Zebra Medical Vision trains algorithms on MRI scans, and they claim to be more accurate and faster than trained physicians. And if you add a qualified physician to the process, it gets much better, so costs should come down while quality goes up. I’m not saying it will, but it should!

When was the last time you felt like you had just met science fiction?

Last night, I was in my self-driving car, my Tesla, talking to my co-founder about polygon machines, because he’s building a whole library of virtualized things for these glasses. I was like, I’m sort of living in the future right now. I was also reading on TechCrunch that Apple is taking a more efficient approach to 3D printing. They just got a patent on how to create 3D printing models – so if you have a pair of glasses and you take a picture of something in 3D, you might want to print it out.

I’m saying it’s going to be a fun few years; we’ll see a real change in how humans think about computing. You’ll be wearing goggles, talking to the glasses and seeing virtual Pokémon walking around. That whole world is no more than 4 years away – 3-4 years for an Apple user and, for somebody like me, probably a year away. So next we’re going to see a lot of change in how humans think about computing and how humans think about working, entertainment and education.

You know, I’m an optimist myself, but you’ve made me even more excited about the future. On the other hand, how do you see the dark side of technology?

Of course, privacy comes to mind first. That’s going to be huge, because these glasses are going to have eye sensors on them and should be able to understand you at a profound level. So, you have to start thinking about ethics, especially in the healthcare field. We’ll have to think about how to keep a patient’s data secure and private, and how the data will be used. These glasses will see your heart rate and, at some level, your electrical system. And taking it a little further, 10 years out, there might be real brain interfaces in these glasses, where you think of something and it appears. That leads to a whole lot of ethical problems. Even with VR.

Second, the government has been using VR to treat PTSD, but I believe that if it can solve PTSD, it can also give you PTSD. If, let’s say, during the Las Vegas shooting somebody had been live streaming with a 360 camera and 100 thousand people had been watching, wouldn’t all those people have PTSD, too, just like you would if you were there on the ground watching somebody get blown away next to you? So we have a lot of things to think about with this new technology, and we need to build defenses. We have to prepare a bit more for the longer term than we did when we brought Facebook into the world, because back then we didn’t consider any of the problems we see there now.

Do you see people losing touch with reality because of these new devices?

Well, my sons are wholly addicted to YouTube on their iPads. Wait until they have a YouTube that they’re inside of, shot with a 360 camera. I put a 360 video made by National Geographic up on Facebook, and it’s enjoyable and feels real if you watch it in VR. It gets much closer to reality than watching a video on YouTube. So if YouTube is addictive on an iPad, wait until YouTube is in 360 inside a VR headset – that can be a real problem for some people.

And we’re going to overlay a lot of information on top of the real world. That’s dangerous – just think about Pokémon Go. It could get you killed, because it gets you to run into the street without thinking. I saw 6,000 people running across a park to collect a Pokémon, so wait until a Pokémon actually looks like it’s walking on the ground or rolling across the street. You might be so focused on it that you run right in front of a bus. The creators of Pokémon will have to worry about that and solve the issue. People have even killed themselves taking selfies with a phone, so this is a real problem. You’re going to be very addicted to playing a game or watching a music video in these glasses, and that can put you in a dangerous place.