Using Geospatial API for Augmented Reality Maps
Aug 31, 2022
I have shown you before how I used augmented reality technology to provide indoor maps for our tech conference venue (post, follow-up post) and for some fun experimentation with virtual physics in the form of an augmented reality game.
I also showed you several dedicated 360 VR websites; for a sample list of them, see the Google VR View post. The latest 360 VR project was about sustainability: I wanted to help map out recycling trash can locations for the California State University Fresno Sustainability Club. The club ordered stickers to mark the trash cans, and I created a 360 website to show their locations around campus. While the campus was deserted during COVID, I went out one weekend day to shoot 360 photos and videos of the trash cans. The currently mapped spots are in the north-east section of the campus, clustered around the science building and its surrounding areas.
The 360 website is fun, especially with a VR headset. However, I envisioned a much more usable scenario, which I described to the interested club members and to Seth Nuzum, a fellow geek-minded developer, when I visited the Meta Store opening. Although the website can still guide a student without a headset, it requires several clicks, and the whole experience could be more immersive. The best solution would be an augmented reality map similar to the AR map of the Bitwise Industries South Stadium conference venue. That app is for indoor use only, and the distances on campus are large enough that even a small directional drift could displace the pins significantly. Such drift accumulates as the phone moves and shakes in the user’s hand. On top of this, activating an indoor map requires a specially crafted image: I would need to stamp each trash can with a unique activation photo and guarantee that the photo’s orientation would never change. Unfortunately, that is infeasible; the trash cans are sometimes moved, so an activation image’s orientation could change and the pins could end up all over the place.
However, you may notice that all of the recycling trash cans are outdoors. We know their GPS coordinates and when a student is looking for them on campus they are also outside. Why couldn’t we take advantage of this and make an outdoor augmented reality map? Before the 2022 Google I/O there were multiple blockers for this:
- An augmented reality anchor could not be tied to a GPS coordinate the way we would need.
- A phone does not know its precise GPS location and, more importantly, its orientation.
I thought about this a lot: it might be possible to create our own wrapper Anchor class tied to a GPS coordinate, with some extra back-end support. A-GPS technology, and lately the dual-band GPS chips in modern phones, allow relatively precise positioning. What I was not able to solve properly is the orientation of the student and the phone. Compass sensors usually require a weird figure-eight calibration procedure, and I cannot feasibly ask users to do that. Even if I asked a user to walk two yards in a straight line, I would get some kind of direction, but regular hand movements would still throw that off significantly.
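To illustrate why the "walk a couple of yards" idea only yields a rough heading: the direction between two GPS fixes can be computed with the standard forward-azimuth formula. This is a hypothetical sketch (the function name is mine, not from any shipped app); note that over a two-yard baseline, even a one-meter GPS error per fix can swing the result by tens of degrees.

```python
import math

def initial_bearing(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Forward azimuth from point 1 to point 2, in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

# Walking due east along the equator yields a 90 degree bearing:
print(initial_bearing(0.0, 0.0, 0.0, 0.0001))  # 90.0
```

Even with a perfect bearing at calibration time, the heading is stale the moment the user turns, which is exactly the problem the compass was supposed to solve.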
In the fall of 2020, the end of a blog post and a teaser video suggested that Google was working hard on a solution to this problem. The name “Earth Cloud Anchors” hinted that the augmented reality team was collaborating with the Google Earth team. It is more than that; all of these technologies are involved:
- ARCore’s AI algorithms
- Street View data
- Google Earth 3D models
- Google’s Visual Positioning System (VPS)
If your app’s target area has Street View coverage, then the positioning and orientation can be even more precise (the framework leverages that data to correct orientation and location). Fortunately, the Fresno State campus has Street View data, although I must note that Street View is optional. I implemented an application following a demonstration codelab. The app is already released to the Google Play Store and featured on the Recycling Trashcan 360 website.
I came across two quirks:
- My mid-range OnePlus Nord phone started to stutter after the app had been running for more than a minute in 100F+ heat.
- The framework’s perceived elevation is about 30 yards lower than the real elevation. The first time I experimented with the app, I could not find my test pins in my apartment complex; once I noticed the perceived elevation, I found the pins hovering in the air.
I was contacted by another developer, Darryl Bartlett, who experienced similar problems. I tried multiple workarounds for my use case:
- The first workaround was to determine the perceived elevation at activation time. I made sure the user was close enough to the nearest pin and assumed the user was at street level, roughly at the same elevation as the pins. I then calculated an offset from this and applied it to the elevation of all pins before placing them into the AR scene.
- Then I noticed that the perceived elevation was offset by the same amount at my apartment complex and on the university campus. For the second workaround, I assumed that the framework simply used a different elevation dataset than Google Maps, Google Earth, and other common elevation services, and that this unknown dataset yields a reproducible offset. So I ended up simply hard-coding the offset, which made the source code a little simpler.
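Both workarounds boil down to the same correction and differ only in where the offset comes from. A minimal sketch of the idea (function names and values are illustrative, not from my actual source code):

```python
# Second workaround: hard-code the empirically observed offset; the framework's
# perceived elevation ran about 30 yards (~27 m) below the real elevation.
HARDCODED_OFFSET_M = -27.0

def activation_offset(perceived_user_elevation: float, nearest_pin_elevation: float) -> float:
    """First workaround: assume the user stands at street level, roughly at the
    same elevation as the nearest pin, and derive the offset at activation time."""
    return perceived_user_elevation - nearest_pin_elevation

def adjust_pins(pin_elevations: list, offset: float) -> list:
    """Shift every pin into the framework's elevation frame before placing anchors."""
    return [e + offset for e in pin_elevations]

# The framework perceives the user at 73 m while the map says 100 m:
offset = activation_offset(73.0, 100.0)     # -27.0
print(adjust_pins([100.0, 102.5], offset))  # [73.0, 75.5]
```

The hard-coded variant simply passes `HARDCODED_OFFSET_M` to `adjust_pins` and skips the proximity check at activation.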
Fortunately, Google engineers must have already been working on a solution, because at the end of August Darryl Bartlett drew my attention to Terrain Anchors. This API does not take an elevation at all, only a supplied hover offset above the terrain. I successfully tested this new API, and it looks like it works: the perceived elevation is still off by about 30 meters, but now that the framework itself calculates the elevation based on the terrain, the pins match up with the world. I have not yet tested corner cases, such as placing a map pin below a tree canopy (a tree canopy bumps the elevation number in Google Earth).
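Conceptually, a terrain anchor shifts the elevation responsibility to the framework: the app supplies only latitude, longitude, and a hover offset above the ground, and the framework resolves the terrain elevation from its own dataset, so any systematic offset cancels out. A toy sketch of that contract (the terrain lookup here is a stand-in for what the framework does internally):

```python
from typing import Callable

def place_terrain_anchor(lat: float, lng: float, hover_m: float,
                         terrain_elevation: Callable[[float, float], float]) -> float:
    """Return the anchor's final elevation: framework-resolved terrain plus hover."""
    return terrain_elevation(lat, lng) + hover_m

# Stand-in terrain dataset: a flat 100 m plateau around the campus coordinates.
flat_terrain = lambda lat, lng: 100.0
print(place_terrain_anchor(36.81, -119.74, 1.5, flat_terrain))  # 101.5
```

Because both the user's perceived elevation and the anchor's elevation come from the same internal dataset, the pins land on the ground even if that dataset disagrees with Google Maps.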
I summarized my journey in a lightning talk not long ago at the Pacific Region Google I/O Extended Web Edition; here is my slide deck, and here is a video recording of it on the North America GDG YouTube channel. Let me know if you are also working on any Virtual Reality, 360, or Augmented Reality projects.