Augmented Reality Symposium at the Inspire Institute (University of Canberra) Thursday

Notes from Rob Manson presentation:

  • Rob Manson is an AR expert at the Inspire Institute at UoCanberra.
  • Azuma’s definitions of AR (spectrum I used in my blended reality paper)
  • Billinghurst released the first ARToolkit in 1999
  • FLARToolkit released in 2009 (ported ARToolkit to Flash) – still marker-based, but enabled AR to reach a much larger audience
  • ARStandards Workshop in Seoul 2010 presented a framework for web-based AR.
  • JSARToolkit released in 2011 – JavaScript version (works well)
  • See Google’s Project Glass video.
  • See Thad Starner’s work on Augmented Cognition – he has recorded all his conversations etc. through a wearable camera for about 20 years
  • “Pervasive Computing”
  • See for the latest AR approaches.

Stephan Barrass

Hawker College iPhone app

  • Student programmed a school app with QR readers for events, assignments, attendance, etc.
  • [Student, Josh, presented the app and streamed his iPhone to appear within his presentation!!]
  • Used Aurasma as the Augmented Reality platform
  • Prospectus contains augmented reality videos that embed – turning the text and images into a multimedia document
  • Possibilities for not only having teachers create augmented reality environments for their students, but also having students as creators of augmented reality environments
  • The student had little programming experience but still programmed the Hawker School App in about 6 weeks.
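As a sketch of the QR-reader idea above: a school app like this might encode each event, assignment or attendance record as a small JSON payload inside the QR code, which the app decodes on scan. The payload format and field names here are assumptions for illustration, not the format the Hawker app actually used:

```python
import json
from datetime import datetime

def parse_event_payload(payload: str) -> dict:
    """Decode a hypothetical JSON payload scanned from a school QR code."""
    event = json.loads(payload)
    # Validate the fields this sketch assumes the app would need.
    for field in ("type", "title", "when"):
        if field not in event:
            raise ValueError(f"missing field: {field}")
    # Parse the timestamp so the app can sort and display events.
    event["when"] = datetime.fromisoformat(event["when"])
    return event

# Example payload as it might be embedded in a QR code for an assembly.
payload = '{"type": "event", "title": "School assembly", "when": "2012-06-14T09:00:00"}'
event = parse_event_payload(payload)
print(event["title"])
```

Keeping the payload self-describing like this means one scanner routine can handle events, assignments and attendance by branching on the `type` field.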

Graham Cassells (Copland College)

  • ICT curriculum should be project based rather than content based – for instance, using Unity3D to have students as designers of games. The syllabus needs to provide flexibility to teach content that doesn’t yet exist, e.g. AR.
  • Students are teaching themselves how to create AR using YouTube – it doesn’t require much prior knowledge.
  • It was suggested that this could even be used at primary school level.
  • They use Unity and String as their platforms.
  • Working towards incorporating sound, touch, etc
  • See for the different possibilities.

After lunch Aurasma session with Matt

  • Took a video on the iPad and then used the Aurasma app to create an augmented reality channel and AR overlays.
  • Aurasma Studio online can get people up and running with AR.
  • Can also register for a developer account at Aurasma, allowing people to do everything that was being done on the iPad – so you could take photos and upload them to the web account using a browser (note that photos are better than scans because they capture the lighting conditions).
  • If you had a statue and wanted to play a video, you could use a geolocation trigger rather than multiple marker (image) triggers; that way you wouldn’t need a separate trigger for each view of the statue.
  • Can also take greenscreen videos and save them out as FLV so that your video plays out in location when triggered.
  • Web Aurasma enables easier management of resources and channels.
  • A Web Aurasma account also enables saving out as an app for the iTunes store or the Android app store, so you could create an excursion app for your school – this can essentially be done for free (except for the iTunes store upload fee; free for the Android app store).
  • Web Aurasma allows creation of different buttons to press in order to get different information.
  • Can also do deeper-level coding using Xcode or Objective-C.
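The geolocation-trigger idea above boils down to a proximity check: fire the overlay once the device is within some radius of the statue. A minimal sketch of that logic – the function names, the 50 m radius, and the coordinates are all assumptions for illustration, not Aurasma’s actual API:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_trigger(device, trigger_point, radius_m=50):
    """Play the overlay video once the device is within radius_m of the trigger."""
    return haversine_m(*device, *trigger_point) <= radius_m

statue = (-35.2387, 149.0846)  # hypothetical coordinates for the statue
print(should_trigger((-35.2386, 149.0845), statue))  # a few metres away
print(should_trigger((-35.3000, 149.2000), statue))  # kilometres away
```

The advantage over image triggers is exactly what was noted in the session: one geofenced point covers every angle of the statue, instead of needing a marker image per view.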

My ideas:

About matthewbower

Professor at Macquarie University.