Is Projection Mapping Augmented Reality?

Is projection mapping augmented reality?

I’ve been mulling this over on and off for a few years now, ever since being utterly mesmerized by Amon Tobin’s projection mapping concert, and since this made the rounds in 2015. My knee-jerk reaction is – no!

Upon further reflection, I’m not so sure. It augments the real world with enhanced data; so why not? Does augmented reality have to mean enhanced information, or does it include *any* data overlay (even just the pretty kind) – and so what if the digital content being overlaid maps to the surface and doesn’t extend it? Does it require that the augmented data only be seen through a screen?

This spectacular example of projection mapping brought the issue back to mind.

The Houston Museum of Natural Science spent 2 years developing “Energy City”, a 2,500-square-foot, cutting-edge projection mapping installation in its Wiess Energy Hall that uses 32 projectors, 11 media servers, and 6 kilometers of fiber. It’s a 3D miniature landscape representing the city of Houston, Texas. Custom content is synced with physical animations to bring the city to life.

Because we’re in the beginning of all of this, there’s a lot of discussion {argument} over semantics. See Kevin Kelly’s recent Mirrorworld article for Wired, answered in short order by Ori Inbar’s “Mirrorworld v. AR Cloud or: How I Learned to Stop Worrying and Love the Spatial Future”, where he pushes back on the term “Mirrorworld” because it implies reflection rather than enhancement. AR/VR/XR/MR – what we *call* it all is a heated discussion right now. While I’m staying out of that one (although for the record’s sake, I prefer “mixed reality” for it all), I can live with projection mapping being considered AR.

A minor point, but one worth chewing on, if only briefly. Does it really matter? Will this particular rose smell as sweet with any of those names? – I think it matters, in that it makes a subject that is already confusing to the mass market that much more confusing. Does it really matter that VR is immersive and AR overlays onto the physical world, when those boundaries are blurring? Are we parsing too precisely, and losing sight of the bigger picture? – which is that, whatever you call it, the AR/VR/MR/XR industry needs to grow.

Slightly uncanny valley

Wow. So. Uncanny valley… one {really bad} selfie and the Pinscreen app maps my face to a 3D avatar from their library (nb: all the female avatars are ridiculously uber sexed up – no, that’s not my body).

It follows what my face is doing as well – with “AR” mode my face was mapped to my boyfriend’s body, with the real room in the background. It’s amazing how quickly this is all developing; that’s three examples in the last few weeks of technology that can quickly – sometimes on the fly – create 3D avatars from existing faces and instantaneously apply realistic, real-time motion.

PS: you’d think with all this technology, they’d at least also have a “beauty filter” button.

PPS: If anyone’s interested in Pinscreen’s fascinating paGAN technology for photorealistic 3D avatar synthesis from a single picture, this is their video “Deep Learning-Based Photoreal Avatars” that they presented at SIGGRAPH Asia 2018 in Tokyo.


Face Swapping: Deepfakes

\"\"Face-swapping celebrity faces onto porn performers’ bodies (\”Deepfakes\”). It\’s a thing

Yes, it’s about porn… but it’s not: if Photoshop has played a major role in bending “reality” to the point where no one believes a photo any more, just wait until the same bending of reality happens easily with video. How will anyone know what’s “real”?

Will there be clipart galleries, just waiting for faces to be superimposed on them?

What about superimposing faces on bodies in VR? I wonder if actors will make deals with entertainment producers to license their faces into VR/AR content, not at the studio level (that’s a big “duh” – it’s cheaper and less hassle than actually dealing with a live person) but as something fans can pay to use in their own content (dare I say it, fantasies?).

I did talk a little bit about this back in 2011. It is, obviously, an inevitability – a natural progression of visual manipulation. But are we ready for this? Legally? Ethically?

 

 

Upcoming appearances

Am honored to be on stage at two events this week! Very excited.

First up, I’m at the innovation festival Propelify on Thursday, where I will be fireside chatting with Beatie Wolfe for 25 minutes about music’s interactive future (South stage, 2:20 pm).

Beatie is a musical innovator in addition to being an accomplished musician. At the forefront of pioneering new formats for music, she brings tangibility, storytelling & ceremony to albums in this digital age.

Propelify is a celebration and exploration of innovation. Techstars and Samsung NEXT are sponsoring a startup competition (maybe I should apply?!) – and Arianna Huffington is giving the keynote address. Plus, a drone race. Which personally I can’t wait to see.

Secretly I just want to be a drone racer.

Some really neat startup semi-finalists are being showcased in the Startup Competition. Like this VR company –

LyraVR is a virtual reality platform that lets you compose, perform and share musical compositions in 3D. Create loops, hand place and tune your sounds in space, press play, and enjoy as your musical masterpieces come to life around you.

and this one –

Geopipe builds immersive virtual copies of the world, generated by algorithms, for architecture, real estate, and beyond. Their algorithms construct virtual models of the real world to provide visualizations for architects, urban planners, and many others.

So many potential directions for that technology to go in! Can’t wait until I can walk through any street in the world in VR, without the bad bugs I always get when traveling.

\"\"*Then* on Sunday (May 21), I\’m part of a panel discussing the future of Entertainment and VR at Creative Tech week. Some great co-panelists! I\’m in esteemed company.

Victoria Pike is a theatrical designer and director who utilizes projection design and mixed reality technologies to create unique theatrical performances and installations. Her background in the theater, designing immersive experiences, has led her to explore 360 video and virtual reality as a new space for dramatic storytelling.

Joel Douek is an award-winning composer and instrumentalist whose music has underscored many films and television productions, including some of the most prestigious documentaries of the last few years – those of naturalist Sir David Attenborough. From big-screen IMAX features such as the BAFTA-winning film ‘Flying Monsters 3D’ and the Everest adventure ‘The Wildest Dream’ (feat. Liam Neeson, Ralph Fiennes) to dark thrillers such as “Manhattan Night” (feat. Adrien Brody, Yvonne Strahovski) and “The Tall Man” (feat. Jessica Biel), Douek’s music has brought many a scene to life.

David Lobser is an internationally recognized, award winning animation director. He uses algorithmic, procedural generation techniques to create lush virtual worlds. He is interested in pushing computer animated effects into the realm of the intricate, messy and imperfect in order to articulate complex feelings and sensations. He has an extensive background in commercial animation and visual effects, has taught animation at Harvard University, and is presently the senior artist in residence at NYU’s Media Research Lab.

Jenya Lugina is a Creative with a deep understanding of technology. He constantly explores the newest developments and best ways to use them to push artistic boundaries and devise innovative solutions for clients such as Pfizer, Merck, Def Jam, HAVAS, Gold Crest Films, Cessna, Nissan, Fuji, Sony and Hasbro. He has worked as a Technology Consultant as well as a Creative Director, creating content for video, web, and interactive installations, using tools that include stereoscopy, autostereoscopy, and virtual reality to produce new ways to communicate messages.

Cortney Harding is a professor, author, and consultant working at the intersection of music and virtual reality. Harding works with technology companies to partner with music artists and labels to create immersive, groundbreaking virtual reality content. Her knowledge of both the music and technology industries positions her uniquely to create experiences that move both industries forward.

Harding is the author of “How We Listen Now: Essays and Conversations About Music and Technology”, published in January 2016 and available here. She is a professor at the Clive Davis School of Music at NYU and has been a frequent speaker at conferences like SXSW, Further Future, Canadian Music Week, SF Music Tech, and the Right Tech Summit.

Location: NYIT Auditorium on Broadway, 1871 Broadway, New York, NY 10023

Starting an AR company!

Excited to announce that after a 20+ year career spent providing strategic direction, thought leadership and operational expertise to global brands, startups & agencies, I am changing gears… and starting a company in the Augmented Reality space.

Name still to be decided! That might just be the hardest part lol.

The Augmented Reality platform is composed of B2B enterprise software + a consumer mobile app, targeting the publishing industry.

I have a fair bit of capital promised for the seed round – investors are highly recognized at a global level, and I’m pulling together an amazing Board of Directors.

This is going to be a fast, big, exciting ride – exciting times, and an exciting technology! Really looking forward to this new challenge.

I am looking for a Tech Co-Founder / CTO. And while I can play one on TV, I’d rather take this ride with an amazing partner whose core competency is doing this stuff and who is excited about being involved with the next wave of technology.

Ideal experience includes:

  • 10 years of Enterprise Software Development Experience
  • Preferably with at least 3-5 years in Game Design and Development using Unity or Unreal Engine to build and ship a product
  • 1-5 Medium to Large Projects taken from Requirements Gathering to Commercialization
  • Familiarity with AR application development
  • Familiarity with Adobe Creative Suite or Autodesk Suite
  • Familiarity with Android and iOS Development
  • Familiarity with Appstore and/or Google Play integration

This is not a gaming company, but experience in game development is a requirement.

Compensation for the seed phase – which should last about 10 months – will be both salary (same as mine) + equity… remember, it’s a seed-stage startup. I’d love to have someone in New York (close by), but am really just looking for the right partner.

More info for the right candidate(s).

If you’re interested, reach out to CTO@decahedralist.com.

Reality, Virtually Hackathon

\"\"So stoked….I am going up to MIT Media Lab\’s all day workshop this Saturday, to learn about programming in Augmented and Virtual Reality as part of their Reality, Virtually Hackathon…while I freely admit that a portion of the nitty gritty programming will undoubtedly be over my head, I\’m going to get a crash course and overview of the essential process, by all the companies who are the big players in the space.

I’m well chuffed, as they say in the UK.

Companies presenting include Unity, whose game engine is used to create both Augmented Reality and Virtual Reality experiences; Microsoft, which is involved because of the HoloLens; Google, whose Tango technology helps devices understand where they are spatially in the world; and others.

Here’s the full agenda. Don’t fall asleep 😉

Reality Virtually

Augmented Reality is projected to be a $120 billion market by 2020 in the US alone; I’m looking at starting a company there next. Fascinating technology with a ton of potential applications, far beyond mere gaming. Its advantage is that it overlays digital content onto the real world, versus having to be completely immersed in a digital one as with Virtual Reality, so it can be used throughout the day and in many natural environments – you don’t have to choose when to use it.

Harvard Business Review just published a short article about the mainstreaming of AR… the technology has been around since 1968, but 2016 is when it’s starting to take off, because of hardware.

AR is less sexy than virtual reality, but has more potential for growth IMO because 1) you don’t need a lot of hardware/gear for it, 2) you don’t need to have a dedicated space for it, 3) people aren’t getting sick from using it (although I have no doubt that will be remedied), and 4) you don’t need to immerse yourself in it completely, shutting out the world. Although I do seem to recall people said much the same about television when it launched (it would “never take off” since people have to sit and watch it, not doing anything else).

So much for predictions and futurists.

I’m going up to Boston to take part in MIT Media Lab’s Reality, Virtually hackathon this weekend – we’ll see what that’s like; hoping to meet people, network, and get a real sense for what’s happening out there.

 

Augmented reality: A primer

I recently had a friend who is a professional artist ask me to explain what the tech behind the PokemonGo phenomenon is, and how he can use it in his work. I’m pasting a bit of my response to him for those who are curious. Augmented Reality (AR) is a fascinating technology, one that’s actually been around for a while – but PokemonGo has brought it into the mainstream.

\"maxresdefault\"What is augmented reality?

A way to superimpose something created over the “real” world. It adds to reality, rather than replacing it. Some call it a mashup – a combination of two or more functional elements + data to create something new and different.

It is usually visual, but can be aural or even tactile. Currently phones are used as the interface (looking at the world through the phone screen to see the overlay), but in the near future eyeglasses, car windshields, etc. will be able to do it as well, so the interaction isn’t so artificial (or hazardous).

The difference between augmented reality and virtual reality is that AR continues to let you interact with the real world, while VR immerses you in a totally different one.

Here’s a good video that explains it (from 2007!)

How does it work?

There are different ways it can be triggered:

  • Using your phone’s sensors: this is what Pokemon Go uses; it “places” objects at geo-coordinates and people “find” them. The sensors it can use include digital compasses, accelerometers, GPS, radar, etc. to sense objects in the real world and draw the virtual data in relation to them. These are most often used in vehicles and mobile phones (see the sketch after this list).
  • Using a pre-determined visual trigger: This is where the phone recognizes the trigger and superimposes the artificial content over it. Sometimes it uses bar codes printed on objects. To do this the phone needs to analyze the video feed from a camera. It then uses the marker as a reference point onto which to draw the virtual data over the real objects. A good example is the De Beers one described below.
  • Markerless: This is a more complex variant of the simpler bar code system, where the computer is programmed to detect markers that are not as specific. These could potentially be any features in the real world, as long as they are visually identifiable enough (street signs are one example). The complexity of detecting these features is greater, but the approach has the capability to be more robust.
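
To make the sensor-based approach concrete, here is a minimal sketch in Python of the core check a geo-triggered app performs: compare the phone’s GPS fix against the coordinates where virtual objects were “placed”, and only draw the ones that are close enough. The coordinates and the 75-meter radius are made up for illustration.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Hypothetical virtual objects "placed" at geo-coordinates: (name, lat, lon)
PLACED_OBJECTS = [
    ("fountain_creature", 40.74170, -74.00150),
    ("park_bench_item",   40.74312, -74.00310),
]

def nearby_objects(device_lat, device_lon, radius_m=75):
    """Return the virtual objects close enough to the phone to be drawn on screen."""
    return [
        name for name, lat, lon in PLACED_OBJECTS
        if haversine_m(device_lat, device_lon, lat, lon) <= radius_m
    ]

if __name__ == "__main__":
    # Pretend the phone's GPS just reported this fix
    print(nearby_objects(40.74165, -74.00162))  # -> ['fountain_creature']
```

A real app would run this check continuously as the GPS fix updates, then use the compass and accelerometer to decide where on screen to draw each nearby object.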

The content that you’re overlaying doesn’t necessarily have to be 3D content: it could trigger a video, for example. This was used in an ad campaign for the Stella Artois Beerfinder App (see examples below).

How to create it?

You need to create an app. The predominant development tool is one that’s used in gaming, the Unity 3D engine, paired with an AR framework to run it on, which seems to predominantly be Qualcomm’s Vuforia (the alternative, with different strengths, is Layar, although it’s not mentioned much). The third competitor – Metaio – was recently bought by Apple and doesn’t seem to be publicly available anymore.

Some good AR examples:

Here are six examples of different AR experiences.

1. Fixed Experience: Desktop Computer – DeBeers

This AR application helps people shop by visualizing an item “on” at home, without having to go to a store.

This example works by printing out a sheet of paper with a graphic on it; this graphic is called a “marker”. The user then holds up the paper so her computer’s camera can see it.

\"De-Beers-Forevermark-Virtual-Try-on-Millemoi\"Her browser accesses the image from the camera, identifies the marker, checks a database for the correct animation.  Her web browser overlays an animation on top of the graphic and the user can move it around,  spin it like a 3D object by moving the paper or using the keys of their keyboard

De Beers used this in their “Forevermark” campaign (link here).
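
For anyone curious what the marker-detection step looks like in code, here is a minimal sketch. It is not De Beers’ actual browser implementation; it just illustrates the same idea in Python using the classic cv2.aruco module from opencv-contrib-python (in OpenCV 4.7+ this API moved to an ArucoDetector class). It watches the webcam for a known fiducial marker and highlights it where a real app would render the 3D ring.

```python
import cv2  # requires opencv-contrib-python (classic, pre-4.7 cv2.aruco API)

# The printed sheet carries a known fiducial marker from a predefined dictionary
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

cap = cv2.VideoCapture(0)  # the webcam, standing in for "her computer's camera"
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is not None:
        # Marker found: this is where a real app would look up the animation in
        # its database and draw it over the marker's corners. Here we simply
        # outline the detected marker.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("marker overlay (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```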

2. Mobile Experience: Stella Artois

This AR application helps people find nearby restaurants and bars that serve the product they’re looking for.

This example works by downloading the Stella Artois Beerfinder App to a smartphone. As someone holds up the phone, it accesses the phone’s camera and GPS/maps functions, as well as mobile data. As he turns around, he can see the streets in front of him through the phone screen.

The app connects to a database, shares the location data (GPS coordinates and the direction the camera is facing), and gets back matching data in response (addresses/storefront info for venues selling the beer). As he looks through the screen, the app overlays an icon on restaurants and bars that sell Stella Artois beer.

It’s a lot faster than googling “who serves Stella Artois near me” and hoping to find an answer.
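
The interesting bit is how a bar’s GPS coordinates turn into an icon at the right spot on the screen. Here is a rough Python sketch of that geometry (not the actual Beerfinder code; the coordinates, screen width and 60-degree field of view are assumptions): it computes the compass bearing to a venue and maps it onto a horizontal pixel position, or skips the venue entirely if the camera isn’t facing that way.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing (degrees clockwise from north) from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlam = math.radians(lon2 - lon1)
    y = math.sin(dlam) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlam)
    return math.degrees(math.atan2(y, x)) % 360

def icon_screen_x(device_heading_deg, venue_bearing_deg, screen_width_px=1080, fov_deg=60):
    """Map a venue onto a horizontal pixel column, or return None if it's off-screen."""
    # Signed angle between where the camera points and where the venue lies
    offset = (venue_bearing_deg - device_heading_deg + 180) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None  # venue is outside the camera's field of view: draw nothing
    return round((offset / fov_deg + 0.5) * screen_width_px)

if __name__ == "__main__":
    # Hypothetical fix: phone at (40.7420, -74.0048) facing due north (heading 0),
    # a bar serving the beer at (40.7431, -74.0040)
    b = bearing_deg(40.7420, -74.0048, 40.7431, -74.0040)
    print(icon_screen_x(device_heading_deg=0.0, venue_bearing_deg=b))
```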

3. Mobile Experience: Pokemon GO

PokemonGo is currently all the rage. It’s not the first AR game, but it is the most successful to date. It’s the ultimate scavenger hunt! – and it’s inadvertently getting people into shape lol.

This is how it works.

A woman downloads the Pokemon Go app and creates an account. The game accesses her camera, GPS, and mobile data; as she plays the game, she “finds” Pokemon tracks superimposed on real-world locations – and follows them.

\"Pikachu\"

To catch a Pokemon she has to hold up her phone so the camera can show where the Pokemon is; now the app is capturing camera data as well as GPS and the direction the camera is pointing. It “shows” a Pokemon character standing on the path of a park or on a bench in a museum. She then captures the Pokemon by “throwing” a ball at it on the screen and earns points, which she shares with her team (other players aligned with the same team).

4. Mobile Experience: Zombie Run (Audio Only)

This AR uses only audio to superimpose a game onto the real world.

The Zombie Run app connects to mobile data, then uses the phone’s GPS and accelerometer data to play sounds for the person playing. When he turns on the game – which is audio-only – he starts running and is given both directions (thanks to GPS) and running-speed data. The directions he hears push him to “find” caches of supplies.

At set intervals the game plays sounds and commands telling the player they are being chased by zombies, which they can outrun if they speed up.

I don’t actually know what happens when they catch you, but I suspect it sounds grisly.
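
The pacing logic behind that chase can be sketched in a few lines of Python. This is a toy version, not the real game’s rules (the 2.8 m/s “zombie speed” is invented), but it shows how the phone-reported running speed can drive which audio layer plays.

```python
def chase_cue(runner_speed_mps, zombie_speed_mps=2.8):
    """Choose which audio layer to play based on the phone-reported running speed.

    The 2.8 m/s zombie pace is an invented number for this sketch; the real
    game's pacing rules aren't public here.
    """
    if runner_speed_mps >= zombie_speed_mps:
        return "play: groans fading into the distance"
    return "play: footsteps and growls closing in (speed up!)"

if __name__ == "__main__":
    for speed in (2.2, 3.4):  # a jog, then a sprint, in meters per second
        print(f"{speed} m/s -> {chase_cue(speed)}")
```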

5. Mobile Experience: Glassholes

\"Terminator-Version-app-pop_10766\"AR that helps recognize and identify people, overlaying information about them on the \’real\” world with the help of Heads Up Display (HUD) (a la Google Glass, or tech enabled eyeglasses).

The Google Glass has a built-in camera, mobile data connectivity and a transparent display in the field of view.

As a person walks through a crowd or a party, the Google Glass captures pictures of people’s faces, instantly processes that info using facial recognition, compares it to data sources such as LinkedIn, and then overlays data for the wearer such as each person’s name, the company they work for, and other useful stats pulled from various social media and other sources.

6. Mobile Experience: Word Lens (now integrated with Google Translate)

This is a very simple AR application, but a great one: triggered by signage, it overlays a translation in whatever language you prefer over the sign through your phone screen (although I’m waiting for it to be integrated into your car windshield, so when driving in a foreign country you will “see” the signs in your preferred language).
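
A stripped-down version of that idea fits in a short Python script. This sketch is not how Word Lens or Google Translate actually work; it assumes the pytesseract OCR library, a hypothetical sign.jpg photo, and a tiny hard-coded phrasebook standing in for a real translation service. But it shows the trigger-then-overlay pattern: read the sign, translate what was recognized, and draw it back onto the image.

```python
import cv2          # pip install opencv-python
import pytesseract  # pip install pytesseract (plus the Tesseract OCR binary)

# Stand-in for a real translation service: a tiny hard-coded phrasebook.
PHRASEBOOK = {"SALIDA": "EXIT", "ALTO": "STOP", "PELIGRO": "DANGER"}

def translate_sign(image):
    """OCR the image, translate any recognized words, and banner the result on it."""
    text = pytesseract.image_to_string(image).upper()
    translated = [PHRASEBOOK[word] for word in text.split() if word in PHRASEBOOK]
    if translated:
        cv2.putText(image, " ".join(translated), (10, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
    return image

if __name__ == "__main__":
    sign = cv2.imread("sign.jpg")  # hypothetical photo of a street sign
    if sign is not None:
        cv2.imwrite("sign_translated.jpg", translate_sign(sign))
```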

So, that’s a basic introduction to Augmented Reality. Any other great examples I’ve missed? Tell me in the comments!

