Augmented Reality (AR) Office App


Augmented Reality Case Study

Type: Android mobile app
Technologies: Android, ARToolkit, DLib, GoogleVision, PostgreSQL+cube, Spring Boot
Duration: 8 months
Team: 1 developer, 1 designer
The client is a US-based company in California. The project is a client-server app that combines three distinct techniques in the client's office environment: augmented reality (AR) visualization, indoor navigation, and face recognition.

The app has two main functions. First, it lets the client's guests or customers use their mobile device to scan an employee's face, uniquely identify that employee, and view their information.
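
On the Android side, face detection in a captured photo can be sketched with the Google Vision face detector. The actual identification of the employee (matching the detected face against employee data on the Spring Boot backend, e.g. with DLib descriptors) is project-specific and only referenced as an assumption here; the helper below simply returns the largest detected face as a crop that the app would then upload for identification.

    import android.content.Context
    import android.graphics.Bitmap
    import com.google.android.gms.vision.Frame
    import com.google.android.gms.vision.face.FaceDetector

    // Detect faces in a captured photo with the Google (Mobile) Vision API and
    // return a crop of the largest one, or null if no face was found. The caller
    // would then send this crop to the backend for identification -- that part
    // is project-specific and not shown here.
    fun detectLargestFace(context: Context, photo: Bitmap): Bitmap? {
        val detector = FaceDetector.Builder(context)
            .setTrackingEnabled(false)
            .setMode(FaceDetector.ACCURATE_MODE)
            .build()
        try {
            val faces = detector.detect(Frame.Builder().setBitmap(photo).build())
            if (faces.size() == 0) return null

            // Pick the largest detected face -- presumably the employee being scanned.
            var best = faces.valueAt(0)
            for (i in 1 until faces.size()) {
                val candidate = faces.valueAt(i)
                if (candidate.width * candidate.height > best.width * best.height) {
                    best = candidate
                }
            }

            // Clamp the bounding box to the photo and crop it out.
            val x = best.position.x.toInt().coerceIn(0, photo.width - 1)
            val y = best.position.y.toInt().coerceIn(0, photo.height - 1)
            val w = best.width.toInt().coerceAtMost(photo.width - x)
            val h = best.height.toInt().coerceAtMost(photo.height - y)
            if (w <= 0 || h <= 0) return null
            return Bitmap.createBitmap(photo, x, y, w, h)
        } finally {
            detector.release()
        }
    }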

The second function involves using the app with Epson Moverio BT-300 augmented reality smart glasses. Here the user gets immediate access to information about the client's completed projects.

The flow goes like this. The user wears the AR glasses, which have a high-resolution camera, and walks through the office's "glory alley", where the icons of completed projects are placed.

By looking at a special marker icon, the user points the camera at the project. Once the camera is fixed on the icon or project logo, a colored image of the project name becomes visible to the user. The user then taps on the project and views its information, such as name, location, duration, and technologies used.
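
The marker-lookup step can be sketched as follows, assuming ARToolkit's Android ARBaseLib API (ARToolKit.getInstance, addMarker, queryMarkerVisible, queryMarkerTransformation). The pattern files and project names below are illustrative placeholders rather than the real project data.

    import org.artoolkit.ar.base.ARToolKit

    // A minimal sketch of the per-frame marker lookup: register one pattern
    // marker per completed-project icon, then report which project's marker is
    // currently visible together with its pose, so the overlay with the project
    // name can be drawn on the see-through display.
    class ProjectMarkers {

        // Maps an ARToolkit marker id to the project it represents.
        private val projectByMarker = mutableMapOf<Int, String>()

        // Register the markers placed along the "glory alley" (placeholder data).
        fun loadMarkers() {
            val patterns = mapOf(
                "single;Data/project_alpha.patt;80" to "Project Alpha",
                "single;Data/project_beta.patt;80" to "Project Beta"
            )
            for ((config, projectName) in patterns) {
                val markerId = ARToolKit.getInstance().addMarker(config)
                if (markerId >= 0) projectByMarker[markerId] = projectName
            }
        }

        // Called once per camera frame: returns the project whose marker is
        // currently visible, plus its 4x4 pose matrix, or null if none is in view.
        fun visibleProject(): Pair<String, FloatArray>? {
            for ((markerId, projectName) in projectByMarker) {
                if (ARToolKit.getInstance().queryMarkerVisible(markerId)) {
                    val pose = ARToolKit.getInstance().queryMarkerTransformation(markerId)
                    return projectName to pose
                }
            }
            return null
        }
    }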

Challenge

  1. With see-through devices such as the Epson glasses, the user does not see the camera image directly, so the projection matrix provided by ARToolkit could not be used.
  2. The sensor data contained additional, meaningless readings and could not be used as-is for long-term position tracking.
  3. The magnetometer data was skewed and had a different scale on each axis, which made the readings dependent on device orientation. In other words, when the user walked the same route twice in opposite directions, the magnetometer produced different values.

Solution

We decided to use a calibration method to derive the eye-to-camera relations. Next, we obtained the intrinsic camera matrix of an eye in the camera coordinate system by calibrating against ten points with known coordinates. This was done for the left and right eye independently.
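
Our calibration procedure itself is custom, but the step of recovering an eye's intrinsic matrix from roughly ten known reference points can be approximated with a standard camera-calibration routine. The sketch below uses OpenCV's Java bindings (Calib3d.calibrateCamera) purely as a stand-in, under the assumption that the world coordinates of the points and their on-display alignments have already been collected for the eye in question; it is run once per eye.

    import org.opencv.calib3d.Calib3d
    import org.opencv.core.Mat
    import org.opencv.core.MatOfDouble
    import org.opencv.core.MatOfPoint2f
    import org.opencv.core.MatOfPoint3f
    import org.opencv.core.Point
    import org.opencv.core.Point3
    import org.opencv.core.Size

    // Estimate the intrinsic matrix of one "eye camera" from ~10 reference points
    // whose world coordinates are known and whose on-display alignments were
    // collected during the calibration procedure.
    // Note: with a single view, the reference points must not all lie in one plane.
    fun calibrateEye(
        worldPoints: List<Point3>,   // known 3D coordinates of the reference points
        displayPoints: List<Point>,  // where the user aligned them on the BT-300 display
        displaySize: Size            // display resolution, e.g. Size(1280.0, 720.0)
    ): Mat {
        val objectPoints = listOf<Mat>(MatOfPoint3f(*worldPoints.toTypedArray()))
        val imagePoints = listOf<Mat>(MatOfPoint2f(*displayPoints.toTypedArray()))

        val cameraMatrix = Mat()        // 3x3 intrinsic matrix K (output)
        val distCoeffs = MatOfDouble()  // distortion output, required by the API
        val rvecs = mutableListOf<Mat>()
        val tvecs = mutableListOf<Mat>()

        Calib3d.calibrateCamera(
            objectPoints, imagePoints, displaySize,
            cameraMatrix, distCoeffs, rvecs, tvecs
        )
        return cameraMatrix
    }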

The values we derived from the magnetic field were used to create a map of our client's office. So when the user walked a route, we traced the changes in the magnetic field to recover the path they had walked. We used a custom-designed calibration method to obtain correction coefficients.
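
The actual correction coefficients came from our own calibration procedure, which is not reproduced here. As a stand-in, the sketch below shows a standard per-axis offset-and-scale (hard-iron plus diagonal soft-iron) correction for Android magnetometer readings, which addresses the same symptom: a different scale on each axis that makes readings depend on walking direction.

    // A simplified stand-in for the custom calibration: collect raw magnetometer
    // samples (SensorEvent.values from Sensor.TYPE_MAGNETIC_FIELD) while the device
    // is rotated through all orientations, derive an offset and a scale coefficient
    // per axis, and apply them so all three axes report on a common scale.
    class MagnetometerCorrection {
        private val minVal = floatArrayOf(Float.MAX_VALUE, Float.MAX_VALUE, Float.MAX_VALUE)
        private val maxVal = floatArrayOf(-Float.MAX_VALUE, -Float.MAX_VALUE, -Float.MAX_VALUE)

        // Feed raw readings during the calibration walk/rotation.
        fun addCalibrationSample(raw: FloatArray) {
            for (i in 0..2) {
                if (raw[i] < minVal[i]) minVal[i] = raw[i]
                if (raw[i] > maxVal[i]) maxVal[i] = raw[i]
            }
        }

        // Apply the derived coefficients to a raw reading. After correction the
        // per-axis scales match, so the values no longer change depending on which
        // direction the user walks along the same route.
        fun correct(raw: FloatArray): FloatArray {
            val offset = FloatArray(3) { (maxVal[it] + minVal[it]) / 2f }
            val range = FloatArray(3) { ((maxVal[it] - minVal[it]) / 2f).coerceAtLeast(1e-6f) }
            val avgRange = (range[0] + range[1] + range[2]) / 3f
            return FloatArray(3) { (raw[it] - offset[it]) * (avgRange / range[it]) }
        }
    }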

Results

  1. Users can now see augmented images perfectly aligned with real objects, with 20 x 20 vision, when using see-through devices.
  2. The user's position can be determined with an accuracy of 5 to 6.5 feet.
  3. Our custom-designed calibration method produced stable, direction-independent values of the magnetic field.

Areas of our expertise

Augmented Reality
Computer Vision
Neural Networks

Other Technologies

Projective and Single View Geometry
MATLAB / Octave