I have a dream

This is what I’d *hypothetically* look like on that motorcycle.

I have a dream… to live in a world where I can (for example, just a random one) be on a motorcycle in the pouring rain, motoring through the hairpin mountain curves of Danang, and have that experience be replicated in a 3D environment so others could experience it as I did – rain, wind, hairpin curves, scenery and all.

AND… interact with those objects as if they were solid, because based on changing angles, I think the computer should be able to interpolate 3D shapes. I’m a dreamer.

Beats sharing flat snapshots any day! All those amazing experiences, shareable in an immersive and interactive form. How incredible would that be?

PS: there are people working on this. Intel for one, the Nokia Ozo for another… and I’m sure many others.


Upcoming appearances

Am honored to be on stage at two events this week! Very excited.

First up, I’m at the innovation festival Propelify on Thursday, where I will be fireside chatting with Beatie Wolfe for 25 minutes about music’s interactive future (South Stage, 2:20 pm).

Beatie is a musical innovator in addition to being an accomplished musician. At the forefront of pioneering new formats for music, she brings tangibility, storytelling and ceremony to albums in this digital age.

Propelify is a celebration and exploration of innovation. Techstars and Samsung NEXT are sponsoring a startup competition (maybe I should apply?!) – and Arianna Huffington is giving the keynote address. Plus, a drone race. Which I personally can’t wait to see.

Secretly I just want to be a drone racer.

Some really neat startup semi-finalists are being showcased in the Startup Competition. Like this VR company –

LyraVR is a virtual reality platform that lets you compose, perform and share musical compositions in 3D. Create loops, hand-place and tune your sounds in space, press play, and enjoy as your musical masterpieces come to life around you.

and this one –

Geopipe builds immersive virtual copies of the real world, generated by algorithms, to provide visualizations for architects, urban planners, real estate, and many others.

So many potential directions for that technology to go in! Can’t wait until I can walk through any street in the world in VR, without the bad bugs I always pick up when traveling.

*Then* on Sunday (May 21), I’m part of a panel discussing the future of entertainment and VR at Creative Tech Week. Some great co-panelists! I’m in esteemed company.

Victoria Pike is a theatrical designer and director who utilizes projection design and mixed reality technologies to create unique theatrical performances and installations. Her background in the theater, designing immersive experiences, has led her to explore 360 video and virtual reality as a new space for dramatic storytelling.

Joel Douek is an award-winning composer and instrumentalist whose music has underscored many film and television productions, including some of the most prestigious documentaries of the last few years – those of naturalist Sir David Attenborough. From big-screen IMAX features such as the BAFTA-winning ‘Flying Monsters 3D’ and the Everest adventure ‘The Wildest Dream’ (feat. Liam Neeson, Ralph Fiennes) to dark thrillers such as ‘Manhattan Night’ (feat. Adrien Brody, Yvonne Strahovski) and ‘The Tall Man’ (feat. Jessica Biel), Douek’s music has brought many a scene to life.

David Lobser is an internationally recognized, award-winning animation director. He uses algorithmic, procedural generation techniques to create lush virtual worlds. He is interested in pushing computer animated effects into the realm of the intricate, messy and imperfect in order to articulate complex feelings and sensations. He has an extensive background in commercial animation and visual effects, has taught animation at Harvard University, and is presently the senior artist in residence at NYU’s Media Research Lab.

Jenya Lugina is a Creative with a deep understanding of technology. He constantly explores the newest developments and best ways to use them to push artistic boundaries and devise innovative solutions for clients such as Pfizer, Merck, Def Jam, HAVAS, Gold Crest Films, Cessna, Nissan, Fuji, Sony and Hasbro. He has worked as a Technology Consultant as well as a Creative Director, creating content for video, web, and interactive installations, using tools that include stereoscopy, autostereoscopy, and virtual reality to produce new ways to communicate messages.

Cortney Harding is a professor, author, and consultant working at the intersection of music and virtual reality. Harding works with technology companies to partner with music artists and labels to create immersive, groundbreaking virtual reality content. Her knowledge of both the music and technology industries positions her uniquely to create experiences that move both industries forward.

Harding is the author of “How We Listen Now: Essays and Conversations About Music and Technology”, published in January 2016 and available here. Harding is a professor at the Clive Davis School of Music at NYU. Harding has been a frequent speaker at conferences like SXSW, Further Future, Canadian Music Week, SF Music Tech, and the Right Tech Summit.

Location: NYIT Auditorium on Broadway, 1871 Broadway, New York, NY 10023

360 Video Virtual Reality

This is actually quite momentous, and something I’ve been musing about (how it would work) for a while. High-def *video* capture – NOT CGI, not a 3D model, but something you can experience in virtual reality space as if you were standing there in the real world.

This is a critical shift, eliminating the need for the artificial creation of worlds and experiences. Can you interact with it (touch anything)? Probably not. But it will come.

I knew there were companies working on taking moving pictures and interpolating 3D models out of them (required so you can move around objects and have them shift as you change your POV), and I’m finally seeing some of it come to fruition. I’m not actually sure that is the case here (it may just be smart faking of 3D perspective), but in order to build a world where you can interact with things, they need to have physical definition – otherwise you won’t be able to touch them, pick them up, etc. So you need to not only record a scene, but interpolate the depth of objects and spaces, map that to a wireframe mesh (3D talk for… an object), AND THEN give the user (visitor?!) the ability to move around the scene. That is a *lot* of computing power.
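For the technically curious, here’s what the middle step of that pipeline looks like in miniature – a hypothetical sketch in plain Python/NumPy with made-up camera numbers, showing how a single depth map gets back-projected into 3D points that a mesh could later be built from. It isn’t from any particular product; it’s just the textbook pinhole-camera math.

```python
import numpy as np

# Hypothetical pinhole-camera intrinsics -- made-up numbers for illustration only.
WIDTH, HEIGHT = 640, 480
FX = FY = 525.0                    # focal length in pixels
CX, CY = WIDTH / 2, HEIGHT / 2     # principal point (image centre)

def depth_to_point_cloud(depth):
    """Back-project a (HEIGHT x WIDTH) depth map in metres into 3D points.

    Each pixel (u, v) with depth z becomes the camera-space point
    ((u - CX) * z / FX, (v - CY) * z / FY, z).
    """
    v, u = np.indices(depth.shape)     # pixel row (v) and column (u) grids
    z = depth
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Fake a depth map (a flat wall 3 m away) just to show the shapes involved.
depth = np.full((HEIGHT, WIDTH), 3.0)
points = depth_to_point_cloud(depth)
print(points.shape)   # (307200, 3): one 3D point per pixel
```

From that point cloud, a surface-reconstruction step (Poisson reconstruction, marching cubes and friends) turns the points into the wireframe mesh a game engine can actually let you walk around – and that’s before you’ve dealt with doing it for every frame of moving video.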

The road to getting this to market is long and steep, but can you imagine what will happen when this technology is merged with all those 360 camera captures out there? You could literally experience or relive what someone else is doing – travel the world without leaving your comfy Barcalounger.

Now we only need some smart company to start engineering “Smell-O-Vision” so you can really experience what it’s like to walk around Jaipur (I only pick on Jaipur because I’ve been and can say, you don’t want Smell-O-Vision).

The interesting bit about this isn’t that realistic 3D VR experiences can be created, but that creating them just got one step easier – and more real. Creating an immersive experience in VR isn’t as easy as it sounds; you know that suspension of disbelief you have to maintain while watching a movie? And how when one thing is off – say, a science fact (umm, maybe that’s just me lol) – you are immediately pulled out of the experience? VR has it even worse. We have a lifetime of experiences in the real world to check against, so if any tiny little thing is off while you’re there – say, a shadow not falling quite right – you’ll be pulled out of the realism. Being able to build these experiences from 360 footage goes a long way toward solving those kinds of problems.

I have a friend who is about to leave the city he’s lived in for 20 years to move to New Zealand. It’s a long distance and he probably won’t get back there often – but with this, he could create his own 360 video and, after he moves, relive his favorite walks as if he were there, whenever he feels like it. And someday he could be joined virtually by his long-distance daughter, and they could walk together.

Isn’t that a wonderful way VR will add to people’s lives?

Reality, Virtually Hackathon

So stoked… I am going up to MIT Media Lab‘s all-day workshop this Saturday, to learn about programming in augmented and virtual reality as part of their Reality, Virtually Hackathon. While I freely admit that a portion of the nitty-gritty programming will undoubtedly be over my head, I’m going to get a crash course and overview of the essential process from all the companies who are the big players in the space.

I’m well chuffed, as they say in the UK.

Companies presenting include Unity, the game engine used to create both augmented reality and virtual reality experiences; Microsoft, who is involved because of the HoloLens; Google’s Tango, technology that helps devices understand where they are spatially in the world; and others.

Here’s the full agenda. Don’t fall asleep 😉

Swimming with the fishes

Using the HTC Vive: note, this is not me.

Went exploring an underwater shipwreck with an HTC Vive tonight, complete with schools of fish, sun rays through the water, jellyfish and a huge whale swimming up to me. It was a full-room VR demo – I had an 8×8 space to walk around in. What fun! It felt amazingly real from the get-go – and boy did the “real” room seem drab after being submerged in a hyper-colored world.

The sunlight piercing the water above me was perfectly rendered through virtual waves – it really was just like being about 50 feet underwater, standing on the deck of a sunken ship.

I had fun trying to “poke” the jellyfish that was swimming a little too close to my head for comfort (you *know* it’s virtual… but it’s hard to remember), and it backed off from my hand every time. I’m sure that’s how the developers are dealing with the fact that although I can see it and walk through it, there’s no real physics going on: I can’t feel any interaction. Although, as per one of my previous posts… there are a multitude of companies working on bringing physical weight and surface interaction to VR.
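My guess at how that jellyfish behavior works under the hood – and it is only a guess – is something as simple as a distance check and a nudge. A minimal, hypothetical sketch in plain Python (invented names and numbers, nothing from the actual demo):

```python
import numpy as np

# Hypothetical values -- not from the actual demo.
AVOID_RADIUS = 0.5    # metres: the jellyfish's "personal space" around your hand
RETREAT_SPEED = 0.8   # metres per second

def jellyfish_step(jelly_pos, hand_pos, dt):
    """Move the jellyfish away from the tracked hand if it gets too close.

    jelly_pos, hand_pos: 3D positions (x, y, z) in metres, as NumPy arrays.
    dt: time since the last frame, in seconds.
    """
    offset = jelly_pos - hand_pos
    distance = np.linalg.norm(offset)
    if 1e-6 < distance < AVOID_RADIUS:
        away = offset / distance                        # unit vector pointing away from the hand
        jelly_pos = jelly_pos + away * RETREAT_SPEED * dt
    return jelly_pos

# Example: hand 0.3 m from the jellyfish, at a Vive-typical 90 frames per second.
jelly = np.array([0.0, 1.5, 0.5])
hand = np.array([0.0, 1.5, 0.2])
jelly = jellyfish_step(jelly, hand, dt=1 / 90)
print(jelly)   # the jellyfish has drifted a little further from the hand
```

No physics engine required – just a reaction that keeps you from ever noticing that your hand passes straight through it.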

When VR does get to the point where it can reproduce physical interaction, I’m not sure why you’d want to leave, to be honest. The world is prettier, brighter, programmable – much better than reality. It feels remarkably similar to how I felt after watching Avatar back in 2011, only better.

No pics of me flailing about in public with the headset on. Probably for the best.

Would love to try this game. I’m not much of a gamer, but I think I could be convinced with virtual reality. It looks eminently hyper-realistic, and if it’s even close to the experience I had with the fishes – a joy to get lost in.