
How Google Maps is making real-time business busyness info more accessible

Google Maps’ popular times and live busyness insights have been saving us from unnecessary stress and helping us make better plans since 2016, but during the ongoing COVID-19 pandemic this information has become an essential tool for many.

Google says that between March and May 2020, users’ engagement with these features increased by 50 percent as more people compared data to find the best ‘social distancing-friendly’ days and times to visit a store. This is why Google is now taking steps to do away with the extra tapping and scrolling typically required to check the live busyness index of a place.

Live busyness information for your destination will now be displayed directly on screen when you’re getting directions in Google Maps.

Google is also on track to increase global coverage of live busyness data by five times compared to June 2020. This expansion includes more outdoor areas, such as beaches and parks, and essential places, like grocery stores, gas stations, laundromats, and pharmacies.

Understanding that the pandemic has brought about many changes to businesses’ offerings, Google has also been leveraging AI tech to ensure that the information you see on Google Maps is up-to-date. Google Duplex, a system that enables people to have a natural conversation with computers, has been calling businesses and verifying their updated hours of operation, delivery and pickup options, and store inventory information for in-demand products such as face masks, hand sanitizer, and disinfectant.

Google says this technology has enabled more than 3 million updates to Maps since April 2020. And soon, Google Maps users will also be able to contribute useful information such as whether a business is taking safety precautions and whether customers are required to wear masks or make reservations.

Interestingly, Google Maps’ AR-powered Live View feature is also going to help users get around safely. How? Let’s say you’re walking around a new neighborhood, and one boutique in particular captures your attention. You’ll now be able to use Live View to quickly learn if the place is open, how busy it is, its star rating, and health and safety information, if available.

What’s a LiDAR sensor doing on an iPhone anyway?

For our regular readers and the geospatial community in general, Light Detection and Ranging, or LiDAR, is part of common parlance. For everyone else, it’s a cool new technology that Apple first incorporated in the iPad Pro and has now brought to its iPhone 12 Pro models.

You will find the sensor integrated into the rear camera cluster of the Pro series. But what exactly is a LiDAR scanner doing on an iPad or iPhone in the first place?

By measuring how long it takes for light to reach an object and reflect back, a LiDAR sensor provides incredible depth-sensing capabilities. And unlike many depth sensors, which perform best in daylight, LiDAR works just as well in low-light environments. For Apple products, this pans out broadly into two main application areas:
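
For the curious, the underlying principle is simple enough to express in a few lines. Here’s a minimal Swift sketch of the time-of-flight calculation a LiDAR sensor relies on; the round-trip time used below is purely illustrative, not a real measurement from the hardware:

```swift
import Foundation

// Time-of-flight in a nutshell: the sensor emits a light pulse, times its
// round trip to a surface and back, and converts that time into a distance.
let speedOfLight = 299_792_458.0  // metres per second

/// Distance to a surface, given the measured round-trip time of a light pulse.
func distance(forRoundTripTime roundTrip: TimeInterval) -> Double {
    // The pulse travels to the object and back, so halve the round trip.
    return speedOfLight * roundTrip / 2.0
}

// Illustrative example: a pulse returning after ~13.3 nanoseconds
// corresponds to a surface roughly two metres away.
print(String(format: "%.2f m", distance(forRoundTripTime: 13.3e-9)))  // ≈ 1.99 m
```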

Improved AR experiences

The LiDAR scanner on iPhone and iPad can be used to create incredible AR experiences that interact with real-world objects. Developers can instantly place AR objects in the real world and take advantage of the device’s depth information to create life-like experiences with real-world physics, object occlusion, and lighting effects. Think better home design tools, interactive shopping apps, and immersive gaming.
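
For developers wondering what that looks like in practice, here’s a minimal sketch of how an ARKit session can opt into the LiDAR scanner’s scene mesh and per-pixel depth. It isn’t any particular app’s implementation (the `makeLiDARConfiguration` helper name is just for this example), simply the standard ARKit configuration that the new hardware unlocks:

```swift
import ARKit

/// Builds a world-tracking configuration that uses the LiDAR scanner,
/// or returns nil on devices without one. (Illustrative helper name.)
func makeLiDARConfiguration() -> ARWorldTrackingConfiguration? {
    // Scene reconstruction is only supported on LiDAR-equipped devices.
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return nil
    }

    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh          // mesh of real-world surfaces
    configuration.frameSemantics.insert(.sceneDepth)   // per-pixel depth from the LiDAR
    configuration.planeDetection = [.horizontal, .vertical]
    return configuration
}

// Usage, inside a view controller that owns an ARView or ARSCNView:
// if let configuration = makeLiDARConfiguration() {
//     arView.session.run(configuration)
// }
```

With the mesh and depth data available, virtual objects can be occluded by real ones and respond to real-world geometry, which is what the life-like physics and occlusion mentioned above boil down to.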

iPhone 12 Pro and Pro Max users will first see this technology being leveraged in Snapchat’s iOS app. The social networking company plans to launch a multitude of LiDAR-powered Lenses specifically for the new iPhone and iPad.

“The addition of the LiDAR Scanner to iPhone 12 Pro models enables a new level of creativity for augmented reality,” says Eitan Pilipski, Snap’s SVP of Camera Platform, pointing out how LiDAR allows Snapchat’s camera to see a metric scale mesh of the scene, understanding the geometry and meaning of surfaces and objects.

The ongoing global pandemic has already shown us how technologies like AR and 3D scanning can open up a world of opportunities for businesses. In the last 3 months alone, spatial data company Matterport has seen more than 100,000 downloads of its iPhone app that can be used to create virtual walkthroughs of 3D spaces.

“Organizations opened virtual museum tours globally, enabled insurance and restoration claims to continue, as well as allowed apartments, offices, and warehouses to continue to be rented,” explains Robin Daniels, CMO, Matterport. “With the announcement of the iPhone 12 Pros with LiDAR, Apple has taken a big step in making powerful 3D sensing hardware mainstream. We couldn’t be more excited about this development and its applications.”

Better low-light photography

Apart from enabling more realistic AR experiences, the LiDAR sensor on iPhone 12 Pro models will improve the camera’s autofocus by up to 6x in low-light scenes. This translates into better accuracy and reduced capture time in photos and videos. Users will also be able to capture Night mode portraits with a beautiful low-light bokeh effect.

The Pro series is priced a couple of hundred dollars above the iPhone 12, which isn’t a bad deal for tech as cool as a LiDAR sensor. What do you think?
