Live blog of the AR and VR Meetup in Sydney

Scott O’Brien kicks us off with a Welcome to Country and a note that the AR Meetup group is growing rapidly – now over 600 members. Ric Holland from eClub Offis solutions – the main sponsor of the AR events – talks about the Disrupted Xmas party coming up on 3 December.

Scott O’Brien is back, talking about what is happening in VR and AR. He talks about a recent event where a 360 rig was set up and the Sydney Dance Company developed choreography to engage a central audience member. He also talks about the active AR audience and the emerging interest in AR in Hollywood.
________________________________________

First demo: IndiVisual Films

The Flying Elephant demo is up, and Barbara Harvey talks about the AR children’s series based on Susanne Gervay’s Elephants Have Wings. Harvey tells us, in rhyming verse, the vision of the project – the transmedia journey of developing the AR experience of the story.

Love the fact that software is described as art in this session. We’re the creator and the created.

The AR experience is activated by the book itself. Music and visuals match the story, narrated by the author. This technology can be used with any children’s book and can help build a connection with stories. This is brilliant for children who learn in different ways.

Suggestions from the crowd include developing a frame for holding phones over pages, and using voice recognition to engage further with the content. These are likely to feature in further development of the product.

Author Susanne Gervay notes that the technology is fantastic, but we need to focus on quality stories to ensure we are giving a voice to young people.
_____________________________________

Second demo: Heart Science

Two girls, Sophie (12) and Kaya (10), supported by Dr Jordan Nguyen, talk about how they came to work with VR on their heart disease project for school. They developed a project to show how blood moves through the arteries: they used Blender to model the arteries and blood cells, imported the assets into Unity, and programmed it in C# to tell the story of heart disease.
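
As a rough illustration of what that Unity/C# step might look like (a minimal sketch of my own, not the students’ actual code – the class name, waypoint setup and speed value are all assumptions), a small script can push a blood-cell object along waypoints placed through an artery model:

using UnityEngine;

// Hypothetical sketch: moves a blood-cell object along a chain of waypoints
// placed inside an artery model (not the students' actual code).
public class BloodCellMover : MonoBehaviour
{
    public Transform[] waypoints;   // empty GameObjects positioned along the artery
    public float speed = 1.5f;      // movement speed in units per second

    private int target = 0;         // index of the waypoint we are heading toward

    void Update()
    {
        if (waypoints == null || waypoints.Length == 0) return;

        // Step toward the current waypoint each frame.
        Vector3 goal = waypoints[target].position;
        transform.position = Vector3.MoveTowards(transform.position, goal, speed * Time.deltaTime);

        // When close enough, move on to the next waypoint (looping).
        if (Vector3.Distance(transform.position, goal) < 0.05f)
        {
            target = (target + 1) % waypoints.Length;
        }
    }
}

Attached to a blood-cell prefab, with the waypoint transforms dragged in via the Inspector, something like this is enough to loop cells through the modelled arteries.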

Great work from two young coding stars!

Third demo: Good for Humankind?

Nguyen starts with an imagination exercise, where we are all asked to imagine a scene on a cliff. He notes that we can all imagine scenes with little prompting.

He talks about his family history and the early iterations of AI, where robots were taught strategy through games, then about his early days at uni and his desire to develop omnidirectional games. He tells the story of a near-miss diving accident in which he nearly broke his neck. That experience led him to seek out stories from people with disability and to consider how technology has assisted, and continues to be developed to provide, greater mobility and independence for people with disability.

What is interesting about the way we are interacting with tech is that we are beginning to develop ways to treat conditions like phantom limb pain.

Nguyen notes that we can read the body’s electrical signals via EEG (brain), EOG (eyes), and EMG (muscles), and use these signals to treat everything from phobias, to pain tolerance, to risk tolerance. He talks of the phobia treatment that can come from AR and VR simulation.

He then describes virtual surgery and diversionary therapy as means of training clinicians and treating people suffering from burns.

Nguyen discusses the UTS Data Arena and the application of data visualisation as a means of better understanding data.

Another application is storytelling, where people can see through the eyes of others as an autonomous experience. From refugee stories to the experience of childbirth, education and cultural exchange, storytelling is key to creative development.

Nguyen talks of his research into mobility and smart wheelchairs for people with locked-in syndrome.

Nguyen’s most recent research is back in gaming – he sees the capacity to imagine what you could do with tech as potentially so enabling for those who need it most. Great stuff!

______________________________________

Fourth demo: David Francis and Computer Vision for Navigation

David introduces us to depth-sensing tech. He notes that consumer-facing devices like the Microsoft Kinect and Nintendo Wii are limited. Then there is tech like Leap Motion, where you can interact with virtual objects by gesturing in the air. And there’s Intel RealSense, which does facial tracking and mood tracking to interact differently with the world.

Francis notes that a lot of the great tech out there is being hoovered up by major tech brands – PrimeSense, the company behind the Kinect’s 3D camera, has been acquired by Apple. And while these technologies are still being developed, it’s all behind closed doors. Fortunately other tech keeps emerging. Francis shows us Spike by Ike – tech that takes measurements from photos. He talks about the augmentation of Google Maps, which is now capturing depth imagery so we can better navigate places and spaces.

He also mentions Project Tango, which is being used to map space in real time through your phone. This is designed for indoor navigation. The problem, of course, is that the tech really doesn’t handle shiny surfaces well yet. But now that depth sensing is coming in, it’s more feasible.

Francis demonstrates tech that can scan physical objects so we can duplicate or engage with our environments.

Francis says we are moving to a spatially referenced environment. We need to be sure we understand how to tell stories effectively in this kind of reality.

___________________________________
Lightning Talks – Brandan Harkin from X Media Labs

Harkin has been interviewing people about transmedia storytelling in LA. He says that VR and AR are the only game in town in the region right now.

The best things Harkin saw were the VR emulation of The Wire film experience and the Superman emulation. He notes that the difference between LA and SF is that LA is all about storytelling, while SF is all about the technology. The best way to find out what is going on is to listen to A16Z every week, and to add the Lelyveld AR and VR business proposition to any pitch. You should also speak in terms of the problems you’re solving when pitching in SF. Finally, he notes there is a lot of money in AR & VR but there are still very few projects out there. Great opportunities on the horizon.

Lightning Talks – Daniel Mack

Mack talks about his development of VR through commercialising immersive military VR. He has now developed tech to improve immersive experiences and reduce VR sickness. He’s also working out how you can move between scenes in an immersive storytelling experience.

Lightning Talks – Landen, Red Cartel

Red Cartel started out working on console games and real-time content, and is now working on VR and AR projects using Kinect and car simulations.

Lightning Talks – Transmedia Projects

Their new service allows you to upload an object; Transmedia will create an environment for it, which you can then download as a service. For these guys, cheap headsets were the opportunity to develop their projects.

________________________________________

Scott O’Brien notes that we can start to use Innovation Firebug to develop new content. He hopes all present will connect through the meetup group to make great things.
