Cesium announces a free, open source plugin for the Unity game engine
#GeoDev


Since its incorporation back in 2019, Cesium has been at the forefront of enabling developers to build 3D applications of any kind. In recent years, the company has been making inroads not just into the 3D applications typically needed for industrial uses (construction, automotive, simulation, etc.) but also into the world of gaming. Cesium’s biggest announcement yet on the gaming front came back in 2021, when the company released its Cesium for Unreal plugin for the Unreal game engine. Since then there has been quite a flurry of activity and partnership announcements between Cesium and Epic Games. Patrick Cozzi (CEO of Cesium) and Marc Petit (VP and General Manager at Epic Games) even launched a podcast together – “Building the Open Metaverse“.

For those of us (including me) who are not fully into the game engine world: Unreal is the game engine from Epic Games, and its biggest competitor (or alternative) is the Unity game engine. Which engine is better? TL;DR – indie game developers tend to go with Unity and big game studios with Unreal.

Given Cesium’s roots and longstanding commitment to the open-source community, it was only a matter of time before the company released a dedicated plugin for Unity as well.

Introducing Cesium for Unity

Last week (November 30, 2022), Cesium officially announced “Cesium for Unity” – an open source plugin that enables real-world 3D data to be used in the Unity game engine.

“Real-time 3D graphics and game engines are key enablers for an engaging and creator-empowered metaverse,” said Cesium CEO Patrick Cozzi. “We look forward to the inspiring experiences the Unity community will build with 3D geospatial data enabled by Cesium for Unity.”

The preview of Cesium for Unity includes a full-scale, highly accurate WGS84 globe for Unity; integration with Unity’s GameObjects, Components, and Character Controllers; and integration with Cesium ion for access to global 3D geospatial content such as global terrain, imagery, 3D buildings, high-resolution photogrammetry, and more.

The company has already released a few samples built with the new plugin. I’m quite curious to see which apps will be developed with it. Maybe a VR version of GeoGuessr? 😉


NISAR Updates from Sisters of SAR
#Events #Science


NISAR

All of us in the SAR world who routinely use Copernicus Sentinel-1 data will agree that NISAR is the next big thing. So we have news for you! For those of you who might not know, NISAR (the NASA-ISRO SAR mission) is a joint Earth observation SAR mission between NASA (National Aeronautics & Space Administration) and ISRO (Indian Space Research Organisation). It is due for launch aboard a GSLV launcher from the Satish Dhawan Space Centre in Sriharikota, India in 2023. It will be the world’s first dual-frequency SAR satellite, able to collect both L-band and S-band SAR data, with a 12-day exact repeat cycle that yields observations roughly every 6 days when ascending and descending orbits are combined. The baseline mission lifetime is three years and all the data will be available free and open.

NISAR also aims to acquire images with consistent geometry, look directions and incidence angles through time, on a per-pixel basis. This is important because it allows changes to be tracked per pixel not only in backscatter, enabling pixel-based time series analysis, but also in phase. Changes in SAR backscatter depend on target scene/object characteristics such as surface roughness, soil moisture content, foliage and dielectric constants. Measuring phase changes enables a technique called interferometric SAR (InSAR), with which we can detect small-scale changes on the Earth’s surface. The mission also plans to acquire data at high to medium spatial resolutions (3–10 m depending on the mode) and, over most land surfaces, coherent dual-pol HH and HV data.
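To make the pixel-based time series idea a bit more concrete, here is a minimal Python sketch using entirely hypothetical numpy arrays in place of real, co-registered NISAR backscatter scenes; it simply flags pixels whose latest value deviates strongly from their temporal history.

```python
import numpy as np

# Hypothetical stack of co-registered, calibrated backscatter images
# (time, rows, cols) in linear power units -- synthetic stand-ins for what a
# consistent-geometry NISAR time series could look like.
stack = np.random.gamma(shape=4.0, scale=0.05, size=(30, 256, 256))

# Convert to dB for analysis.
stack_db = 10.0 * np.log10(stack)

# Per-pixel temporal statistics over the time axis.
mean_db = stack_db.mean(axis=0)
std_db = stack_db.std(axis=0)

# Flag pixels where the latest acquisition deviates strongly from the
# temporal mean -- a crude change indicator (e.g. harvest, flooding).
latest = stack_db[-1]
change_mask = np.abs(latest - mean_db) > 3.0 * std_db

print(f"Changed pixels: {change_mask.sum()} of {change_mask.size}")
```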

The NISAR resources from the two space organisations are also great places to learn about the mission and the SAR techniques that it will allow. 

https://nisar.jpl.nasa.gov/mission/get-to-know-sar/polarimetry/

https://www.isro.gov.in/NISAR%20payload.html

SAR Interferometry

InSAR is awesome! And not just because it allows us to analyse and monitor deformation of the Earth’s surface from space. When satellites acquire images of the same area with consistent geometries, as planned for NISAR, InSAR can be used to exploit phase changes in the data and to measure coherence. These can be related to changes on the Earth’s surface: monitoring subsidence and uplift, post-earthquake deformation, deformation from volcanic eruptions, agricultural practices like tillage and harvesting, and so on. It involves a number of steps, and if you’ve done everything correctly and your reference and secondary images are a good match, you will be rewarded with beautiful fringe patterns showing changes on the Earth’s surface in the most vivid of colours.
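As a rough illustration of what that involves numerically, here is a minimal Python sketch that forms an interferogram and a coherence estimate from two co-registered single-look complex (SLC) images. The arrays are purely synthetic stand-ins, and real processing adds co-registration, flat-earth/topographic phase removal and filtering steps that are skipped here.

```python
import numpy as np

# Hypothetical co-registered SLC images of the same area from two passes --
# synthetic stand-ins for a real reference/secondary pair.
rng = np.random.default_rng(0)
shape = (256, 256)
reference = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)
secondary = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

# Interferogram: the phase of reference * conj(secondary) carries the
# path-length difference between the two acquisitions.
interferogram = reference * np.conj(secondary)
phase = np.angle(interferogram)   # the "fringes", wrapped to [-pi, pi]


def boxcar(a, w=4):
    """Average over non-overlapping w x w windows (simple multilooking)."""
    h, l = (a.shape[0] // w) * w, (a.shape[1] // w) * w
    return a[:h, :l].reshape(h // w, w, l // w, w).mean(axis=(1, 3))


# Coherence: magnitude of the locally averaged interferogram, normalised by
# the local power of each image. 0 = decorrelated, 1 = perfectly coherent.
num = np.abs(boxcar(interferogram))
den = np.sqrt(boxcar(np.abs(reference) ** 2) * boxcar(np.abs(secondary) ** 2))
coherence = num / den

print(phase.shape, coherence.shape, float(coherence.mean()))
```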

SAR Polarimetry

Let’s talk a little about SAR polarimetry. A dual-polarimetric SAR sensor transmits the signal in one particular polarisation, either horizontal (H) or vertical (V), but can receive the signal in both polarisations. So when SAR users talk of HH, HV, VV and VH, they are referring to the polarisation of the signal during transmission and reception, the first letter indicating the transmitted polarisation and the second the received one. Being able to transmit and receive signals in these different polarisations enables a technique called SAR polarimetry, with which we can differentiate between objects on the Earth’s surface based on how they change the polarisation of the transmitted SAR signal. Sensors like TerraSAR-X offer quad-pol modes, which means they can transmit in both polarisations and receive in both polarisations, thereby giving us, wait for it, VV, VH, HH and HV polarised images.
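As a small, hedged example of what dual-pol data lets you do, here is a Python sketch computing a cross-pol ratio and a dual-pol radar vegetation index style measure from hypothetical HH/HV backscatter arrays; the threshold and the data are illustrative only, not an operational workflow.

```python
import numpy as np

# Hypothetical calibrated dual-pol backscatter (linear power), e.g. HH and HV
# channels of the same scene -- the kind of data NISAR plans to deliver over
# most land surfaces.
hh = np.random.gamma(shape=4.0, scale=0.08, size=(256, 256))
hv = np.random.gamma(shape=4.0, scale=0.02, size=(256, 256))

# Cross-pol ratio in dB: volume scatterers (e.g. forest canopies) depolarise
# the signal more, so HV rises relative to HH.
cross_pol_ratio_db = 10.0 * np.log10(hv / hh)

# One commonly used dual-pol radar vegetation index style formulation:
# higher values indicate more volume scattering.
rvi_like = 4.0 * hv / (hh + hv)

# Very rough thresholding into "vegetated" vs "bare/smooth" surfaces -- a toy
# illustration with an arbitrary threshold, not a classifier.
vegetated = cross_pol_ratio_db > -8.0
print(f"Fraction flagged vegetated: {vegetated.mean():.2f}")
```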

For those who have not crossed over to the Grayscale side, the SAR side, you will find tonnes of resources about SAR, SAR polarimetry and SAR interferometry on our Sisters of SAR resources page: 

https://sistersofsar.wixsite.com/sistersofsar/sar-resources

NISAR Science meeting updates

Getting back to NISAR: a NISAR science meeting/workshop recently took place at JPL in Pasadena, and Sisters of SAR was present, represented by Sarah Banks. Here are some important updates from the workshop:

  • Launch is now scheduled for January 29th, 2023.
  • S-band data will have limited coverage and will be acquired over India and a few other sites.
  • NISAR will use a technique called SweepSAR to achieve very wide swaths (> 240 km), maximise the signal-to-noise ratio, and reject range ambiguities.
  • From these data, a number of Level 3 products (read also as analysis-ready products?) will be generated, including a global soil moisture product.
  • The NISAR mission aims to achieve certain accuracy requirements for its Level 3 products (e.g., 80% for soil moisture over a 2-hectare area).
  • Level 2 products will be delivered in UTM coordinates and Level 3 products in an equal-area projection, the EASE-Grid format (but there is still time for the community to provide feedback to change this to UTM).
  • The grid will be deterministic throughout the life of the mission, so a pixel’s location in one scene will be the same through time, allowing users to select a pixel and do time series datacube analysis (see the sketch after this list).
  • Several algorithms will be available on GitHub so that users can generate their own Level 3 products, including algorithms to quantify woody biomass, vegetation disturbance, wetland inundation, crop area and more.
  • NISAR will not produce a global biomass product, but the code will be available to implement one.
  • There is still time to change a few things (e.g., the projection, or adding a global biomass product), but the team needs to know the community wants it and needs funding partners.
  • The NISAR team is still looking for cal/val sites for soil moisture and other products, so if you work on such a team, get in touch! NISAR will have established methodologies that people can implement to make their site useful for cal/val.
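To show why a deterministic grid is so convenient, here is a minimal Python sketch (using xarray, my choice of tool, with entirely synthetic data) that stacks scenes sharing the same pixel grid into a datacube and pulls out a single pixel’s time series without any resampling.

```python
import numpy as np
import pandas as pd
import xarray as xr

# Hypothetical stack of scenes on a fixed, deterministic grid: because every
# acquisition shares the same pixel locations, the scenes can be stacked
# directly along a time dimension without resampling.
times = pd.date_range("2024-01-01", periods=10, freq="12D")  # 12-day repeat
data = np.random.gamma(shape=4.0, scale=0.05, size=(10, 200, 200))

cube = xr.DataArray(
    data,
    dims=("time", "y", "x"),
    coords={"time": times},
    name="backscatter",
)

# Pull the full time series for a single pixel, plus a temporal mean map.
pixel_series = cube.isel(y=100, x=100)
temporal_mean = cube.mean(dim="time")
print(pixel_series.values, temporal_mean.shape)
```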

Finally, future NISAR users can still apply to become early adopters and gain access to simulated NISAR data that they can begin testing with. At the meeting, a number of different funding opportunities were also announced. These include DEVELOP, an applied sciences training program where participants gain experience applying Earth observations by working on interdisciplinary projects, mentored by science advisors from NASA and partner agencies.

If you have not done so already, go to the NISAR links above for more information, or get in touch with Sisters of SAR and we’ll point you in the right direction.

