Augmented Reality/Tutorial



Augmented reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input such as sound, video, graphics, or GPS data. It is related to a more general concept called computer-mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. Augmented reality enhances one's current perception of reality, whereas virtual reality replaces the real world with a simulated one. Augmentation techniques are typically performed in real time and in semantic context with environmental elements, such as overlaying supplemental information like scores over a live video feed of a sporting event.

With the help of advanced AR technology (e.g. adding computer vision and object recognition), the information about the user's surrounding real world becomes interactive and digitally manipulable. Information about the environment and its objects is overlaid on the real world. This information can be virtual or real, e.g. seeing other real sensed or measured information such as electromagnetic radio waves overlaid in exact alignment with where they actually are in space. Augmented reality brings components of the digital world into a person's perceived real world. One example is an AR helmet for construction workers that displays information about the construction site. The first functional AR systems that provided immersive mixed-reality experiences for users were invented in the early 1990s, starting with the Virtual Fixtures system developed at the U.S. Air Force's Armstrong Labs in 1992.

Hardware
Hardware components for augmented reality are: processor, display, sensors, and input devices. Modern mobile computing devices like smartphones and tablet computers contain these elements, typically including a camera and MEMS sensors such as an accelerometer, GPS, and solid-state compass, making them suitable AR platforms.

Location-based Augmented Reality
MixARE (mix Augmented Reality Engine) was one of the first open-source (GPLv3) augmented-reality engines for mobile devices (Android and iPhone). Location-based augmented reality (like MixARE) uses the camera image of the mobile device and the sensors of the smartphone to display icons in the camera image. The application can be used for displaying context-dependent information that refers to the current geolocation (2010). The development of web-based technologies and the improved performance of mobile devices allow the implementation of location-based, MixARE-like infrastructure as HTML5 applications based on JavaScript. Key open-source technologies for web-based, location-based augmented reality are GeoAR and AR.js.
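The core computation behind placing an icon in the camera image is turning the device's geolocation and a point of interest into a distance and a compass bearing; the icon is then drawn when the bearing falls inside the camera's field of view relative to the device heading. A minimal sketch of that calculation (the function names are illustrative, not part of the MixARE or AR.js APIs):

```javascript
// Distance (metres) and initial bearing (degrees) from the device
// position to a point of interest -- the two values a location-based
// AR engine combines with the compass heading to place an icon.
function toRad(deg) { return deg * Math.PI / 180; }

function distanceAndBearing(lat1, lon1, lat2, lon2) {
  const R = 6371000; // mean Earth radius in metres
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  // Haversine formula for the great-circle distance
  const a = Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  const distance = 2 * R * Math.asin(Math.sqrt(a));
  // Initial bearing, normalised to 0..360 degrees (0 = north, 90 = east)
  const y = Math.sin(dLon) * Math.cos(toRad(lat2));
  const x = Math.cos(toRad(lat1)) * Math.sin(toRad(lat2)) -
            Math.sin(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.cos(dLon);
  const bearing = (Math.atan2(y, x) * 180 / Math.PI + 360) % 360;
  return { distance, bearing };
}
```

In a browser, the device position would come from `navigator.geolocation.watchPosition()` and the heading from the device-orientation sensors.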

Marker-based and location-based AR with AR.js
Explore augmented reality in JavaScript with AR.js (see the YouTube example tutorial), visit the open-source GitHub repository of AR.js to learn about creating your own augmented-reality web application, or start with the WebAR Playground.
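A marker-based AR.js page can be as small as a single HTML file. The sketch below follows the common pattern from the AR.js documentation; the script URLs and the AFrame version number are assumptions and may need updating:

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- AFrame first, then the AR.js build for AFrame -->
    <script src="https://aframe.io/releases/1.3.0/aframe.min.js"></script>
    <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
  </head>
  <body style="margin: 0; overflow: hidden;">
    <a-scene embedded arjs>
      <!-- The "hiro" preset marker ships with AR.js; print it and
           point the camera at it to see the box appear on top. -->
      <a-marker preset="hiro">
        <a-box position="0 0.5 0" material="color: red;"></a-box>
      </a-marker>
      <a-entity camera></a-entity>
    </a-scene>
  </body>
</html>
```

Open the page over HTTPS on a device with a camera; AR.js asks for camera permission and renders the 3D box on the printed marker.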

See AFrame at Work

 * AFrame examples from the open-source portal (some examples can be viewed only with the Chromium/Chrome browser; some work with Firefox). With AFrame you create and test web-based Virtual Reality (WebVR) objects that can be used as 3D models in AR.js.
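A minimal AFrame scene runs in any WebVR-capable browser. The sketch below follows AFrame's introductory "hello world" example (the version number in the script URL is an assumption):

```html
<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/1.3.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Primitive shapes positioned in 3D space -->
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

The same entity markup placed inside an `<a-marker>` element is how AFrame objects become 3D models in AR.js.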

See AR.js at Work

 * YouTube video of basic features by Jerome Etienne (2017/02/23)
 * AR.js with image tracking and location-based AR
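Location-based AR in AR.js anchors entities to GPS coordinates instead of printed markers. A sketch following the AR.js location-based documentation (the coordinates are placeholders for a point of interest of your choice):

```html
<a-scene vr-mode-ui="enabled: false" embedded arjs="sourceType: webcam;">
  <!-- gps-camera reads the device GPS and orientation sensors -->
  <a-camera gps-camera rotation-reader></a-camera>
  <!-- gps-entity-place pins the text to a real-world coordinate;
       replace latitude/longitude with your own point of interest -->
  <a-text value="Zoo entrance"
          gps-entity-place="latitude: 48.2082; longitude: 16.3738"
          scale="20 20 20"></a-text>
</a-scene>
```

When the device is near the given coordinate and pointed toward it, the text appears in the camera image.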

See TrackingJS at Work
The open-source JavaScript library tracking.js supports augmented reality without GPS, accelerometer, or device-orientation sensors. It is mainly based on object detection in the camera image. Recognized objects can be replaced by 3D objects, videos, or images, and even specific movements of your hand can trigger augmented audio samples (e.g. an augmented bouncing ball that creates a sound every time the ball touches the real ground in the camera image).
 * Analyse how users can interact with the implemented tracking.js feature and how digital objects can be manipulated with hand movements.
 * Create a software design in which handicapped people who are unable to type can interact with a computer and browser-based applications, and compare these options with the open-source Simon (KDE) or the PocketSphinxJS open-source library for web-based speech recognition with submission of the audio recording to TwoogleBook et al.
 * Handicapped people might not be able to type. Analyse open-source speech recognition for handicapped people in comparison to human-computer interaction with movement detection.
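The bouncing-ball example above can be reduced to a small pure function: given successive vertical positions of a tracked object and the ground line detected in the camera image, decide when to trigger the audio sample. The function names are illustrative, not part of the tracking.js API:

```javascript
// Returns true exactly when the tracked object crosses the ground
// line from above -- the moment the augmented bounce sound should play.
// Image coordinates grow downward, so "touching the ground" means
// moving from a smaller y onto or past the ground line.
function shouldTriggerBounce(prevY, currY, groundY) {
  return prevY < groundY && currY >= groundY;
}

// Wraps the check with state, so it can consume successive
// bounding-box bottom edges from a tracker callback.
function makeBounceDetector(groundY, playSample) {
  let prevY = -Infinity;
  return function onTrack(y) {
    if (shouldTriggerBounce(prevY, y, groundY)) {
      playSample(); // e.g. play an <audio> element in the browser
    }
    prevY = y;
  };
}
```

In a real page, `onTrack` would be called from the tracker's per-frame event with the bottom edge of the detected object's bounding box.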

See Mixare at Work
Watch the Mixare demo video about the application of MixARE in Vienna, Austria, to understand how computer-generated information related to the current geolocation is placed in the real-life camera image of a smartphone. Compare the similarities and differences to AR.js.

Create your own Mixare Database
The following part of the tutorial supports you in creating your own Mixare database.
 * 1) Select an area in your hometown, e.g. the zoo in your city.
 * 2) Attach digital information to real geolocations. To do this you need to search for
   * Wikipedia information, e.g. articles about the animals you can observe in your zoo, or
   * YouTube videos (or existing videos used in Wikiversity) that show the animals in their natural environment. These media or learning resources are attached to a real geolocation.
 * 3) Select the geolocation for the digital learning resources, e.g. the place in your zoo where you can watch the real animals, and augment the real experience with your selected digital learning resources. Create the Mixare database with the Mixare4JSON editor. The created Mixare database can then be stored on a web server of your choice. This can be done in two ways:
 * 4) Storage web server: select a storage location on a web server for your Mixare JSON database created with the Mixare4JSON editor.
   * Ask the administrator of the zoo's web server if you want to share your work on the zoo's web page, and place a link to Mixare on the website as well. Other zoo visitors can then use your contribution to the Mixare database, and maybe the zoo administration will support your work in the future if they like your augmented-reality approach.
   * If you cannot use an existing web server, store the Mixare database created with the Mixare4JSON editor on GitHub: create a GitHub account and a GitHub repository, e.g. MixareDB4MyZoo. Create a subdirectory docs/ in your GitHub repository with an index.html that contains a download link for your JSON Mixare database. Learn how to populate your GitHub repository.
 * 5) Press the menu button on your Android phone to select your own data source for Mixare. Select the data source of the JSON file.
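A Mixare data source is a JSON document served over HTTP. The sketch below shows the shape of such a database with one point of interest; the field names follow the Mixare example data sources, and the values are placeholders for your own zoo entries:

```json
{
  "status": "OK",
  "num_results": 1,
  "results": [
    {
      "id": "1",
      "lat": "48.2082",
      "lng": "16.3738",
      "elevation": "0",
      "title": "Elephant enclosure",
      "webpage": "https://en.wikipedia.org/wiki/Asian_elephant"
    }
  ]
}
```

Each entry pins one learning resource (`webpage`) to one geolocation (`lat`/`lng`); add one object to `results` per point of interest and keep `num_results` in sync.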