I have a dream… to live in a world where I can (for example, just a random one) be on a motorcycle in the pouring rain, motoring through the hairpin mountain curves of Danang, and have that experience replicated in a 3D environment so others could experience it as I did – rain, wind, hairpin curves, scenery and all.
AND…interact with those objects as if they were solid… because based on changing angles, I think the computer should be able to interpolate 3D shapes. I’m a dreamer.
Beats sharing flat snapshots any day! All those amazing experiences, sharable in an immersive and interactive experience. How incredible would that be?
PS there are people working on this. Intel for one, Nokia with the Ozo another… and I’m sure many others.
Am honored to be on stage at two events this week! Very excited.
First up, I’m at the innovation festival Propelify on Thursday, where I will be fireside chatting with Beatie Wolfe for 25 minutes about music’s interactive future (South stage, 2:20 pm).
Beatie is a musical innovator in addition to being an accomplished musician. At the forefront of pioneering new formats for music, she brings tangibility, storytelling & ceremony to albums in this digital age.
Propelify is a celebration and exploration of innovation. Techstars and Samsung NEXT are sponsoring a startup competition (maybe I should apply?!) – and Arianna Huffington is giving the keynote address. Plus, a drone race. Which personally I can’t wait to see.
Secretly I just want to be a drone racer.
Some really neat startup semi-finalists are being showcased in the Startup Competition. Like this VR company –
LyraVR is a virtual reality platform that lets you compose, perform and share musical compositions in 3D. Create loops, hand place and tune your sounds in space, press play, and enjoy as your musical masterpieces come to life around you.
and this one –
Geopipe uses algorithms to build immersive virtual copies of the real world, providing visualizations for architects, urban planners, real estate, and many others.
So many potential directions for that technology to go in! Can’t wait until I can walk through any street in the world with VR, without the bad bugs I always get when traveling.
*Then* on Sunday (May 21), I’m part of a panel discussing the future of entertainment and VR at Creative Tech Week. Some great co-panelists! I’m in esteemed company.
Victoria Pike is a theatrical designer and director who utilizes projection design and mixed reality technologies to create unique theatrical performances and installations. Her background in the theater, designing immersive experiences, has led her to explore 360 video and virtual reality as a new space for dramatic storytelling.
Joel Douek is an award-winning composer and instrumentalist whose music has underscored many films and television productions, including some of the most prestigious documentaries of the last few years – those of naturalist Sir David Attenborough. From big-screen IMAX features such as the BAFTA-winning film “Flying Monsters 3D” and the Everest adventure “The Wildest Dream” (feat. Liam Neeson, Ralph Fiennes) to dark thrillers such as “Manhattan Night” (feat. Adrien Brody, Yvonne Strahovski) and “The Tall Man” (feat. Jessica Biel), Douek’s music has brought many a scene to life.
David Lobser is an internationally recognized, award-winning animation director. He uses algorithmic, procedural generation techniques to create lush virtual worlds. He is interested in pushing computer animated effects into the realm of the intricate, messy and imperfect in order to articulate complex feelings and sensations. He has an extensive background in commercial animation and visual effects, has taught animation at Harvard University, and is presently the senior artist in residence at NYU’s Media Research Lab.
Jenya Lugina is a Creative with a deep understanding of technology. He constantly explores the newest developments and best ways to use them to push artistic boundaries and devise innovative solutions for clients such as Pfizer, Merck, Def Jam, HAVAS, Gold Crest Films, Cessna, Nissan, Fuji, Sony and Hasbro. He has worked as a Technology Consultant as well as a Creative Director, creating content for video, web, and interactive installations, using tools that include stereoscopy, autostereoscopy, and virtual reality to produce new ways to communicate messages.
Cortney Harding is a professor, author, and consultant working at the intersection of music and virtual reality. Harding works with technology companies to partner with music artists and labels to create immersive, groundbreaking virtual reality content. Her knowledge of both the music and technology industries uniquely positions her to create experiences that move both industries forward.
Harding is the author of “How We Listen Now: Essays and Conversations About Music and Technology”, published in January 2016 and available here. Harding is a professor at the Clive Davis School of Music at NYU. Harding has been a frequent speaker at conferences like SXSW, Further Future, Canadian Music Week, SF Music Tech, and the Right Tech Summit.
Location: NYIT Auditorium on Broadway, 1871 Broadway, New York, NY 10023
Excited to announce that after a 20+ year career spent providing strategic direction, thought leadership and operational expertise to global brands, startups & agencies, I am changing gears…and starting a company in the Augmented Reality space.
Name to still be decided! That might just be the hardest part lol.
The Augmented Reality platform comprises B2B Enterprise Software + a consumer mobile app, targeting the publishing industry.
I have a fair bit of capital promised for the seed round – investors are highly recognized at a global level, and I’m pulling together an amazing Board of Directors.
This is going to be a big, fast, exhilarating ride – exciting times and an exciting technology! Really looking forward to this new challenge.
I am looking for a Tech Co-Founder / CTO. And while I can play one on TV, I’d rather take this ride with an amazing partner whose core competency is doing this stuff and who is excited about being involved with the next wave of technology.
Ideal experience includes:
10 years of Enterprise Software Development Experience
Preferably with at least 3-5 years in Game Design and Development using Unity or Unreal Engine to build and ship a product
1-5 Medium to Large Projects taken from Requirements Gathering to Commercialization
Familiarity with AR application development
Familiarity with Adobe Creative Suite or Autodesk Suite
Familiarity with Android and iOS Development
Familiarity with App Store and/or Google Play integration
This is not a gaming company but experience in game development is a requirement.
Compensation for the seed phase – which should last about 10 months – will be both salary (same as me) + equity…remember it’s a seed stage startup. I’d love to have someone in New York (close by), but am really just looking for the right partner.
More info for the right candidate(s).
Reach out if you’re interested to CTO@decahedralist.com.
This is actually quite momentous, and something I’ve been musing about (how it would work) for a while. High-def *video* capture – NOT CGI, not a 3D model, but something you can experience in virtual reality space as if you were standing there in the real world.
This is a critical shift, eliminating the need for the artificial creation of worlds/experiences. Can you interact with it (touch anything)? Probably not. But it will come.
I knew there were companies working on taking moving pictures and interpolating 3D models out of them (required to move around things and have them shift as you move your POV), and I’m finally seeing some of it come to fruition. I’m not actually sure that’s the case here (it may just be smart faking of 3D perspective), but to develop a world where you can interact with things, they need to have physical definition – otherwise you won’t be able to touch them, pick them up, etc. So you need to not only record a scene, but interpolate the depth of objects and spaces, map that to a wireframe (3D talk for…an object), AND THEN give the user (visitor?!) the ability to move around the scene. That is a *lot* of computing power.
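To make the “interpolate the depth” bit a little more concrete, here’s a minimal toy sketch of the classic stereo idea: if you know a camera’s focal length and how far apart two viewpoints are, the pixel shift (disparity) of a feature between the two images tells you how far away it is, and from there you can lift pixels into 3D points. All the numbers below are made-up illustration values – this is the textbook pinhole-stereo relation, not anything from a specific product.

```python
# Toy sketch: recovering 3D structure from two viewpoints.
# Assumes a calibrated stereo pair: focal length f (in pixels) and
# baseline B (in metres) are known; disparity d is the horizontal
# pixel shift of a feature between the left and right images.

def depth_from_disparity(f: float, baseline: float, disparity: float) -> float:
    """Classic pinhole-stereo relation: z = f * B / d."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    return f * baseline / disparity

def backproject(u: float, v: float, z: float, f: float, cx: float, cy: float):
    """Lift a pixel (u, v) at depth z into a 3D camera-space point."""
    x = (u - cx) * z / f
    y = (v - cy) * z / f
    return (x, y, z)

# Example: a feature at pixel (420, 240) that shifts 16 px between views,
# with f = 800 px, baseline = 0.1 m, principal point at (320, 240).
z = depth_from_disparity(800.0, 0.1, 16.0)          # 5.0 metres away
point = backproject(420.0, 240.0, z, 800.0, 320.0, 240.0)
```

Do this for every pixel of every frame and you start to see why the computing-power bill is so steep – and real systems then still have to stitch those points into meshes before anything can be “touched.”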
The road to getting this to market is long and steep, but can you imagine what will happen when this technology is merged with all those 360 camera captures out there? You could literally experience or relive what someone else is doing – travel the world without leaving your comfy Barcalounger.
Now we only need some smart company to start engineering “Smell-O-Vision” so you can really experience what it’s like to walk around Jaipur (I only pick on Jaipur because I’ve been and can say, you don’t want Smell-O-Vision).
The interesting bit about this isn’t that realistic, created 3D VR experiences exist, but that creating them just got one step closer to being easier – and more real. Creating an immersive experience in VR isn’t as easy as it sounds; you know that suspension of disbelief you have to maintain while watching a movie? And how when one thing is off – say a science fact (umm, maybe that’s just me lol) – you are immediately pulled out of the experience? VR has it even worse. We have a lifetime of experiences in the real world to check against, so if any tiny little thing is off when you’re there – say, a shadow not being right – you’ll be pulled out of the realism. Being able to use 360 footage goes a long way toward solving those kinds of problems.
I have a friend who is about to leave the city he’s lived in for 20 years, to move to New Zealand. It’s a long distance and he probably won’t get back there often – but with this, he could create his own 360 video, and after he moves, relive his favorite walks as if he were there, whenever he feels like it. And some day, he can be joined by his long-distance daughter virtually, and they can walk together.
Isn’t that a wonderful way VR will add to people’s lives?