Loading Sketchfab models in ATON!

It is now possible to directly load Sketchfab 3D models in ATON! Check out the official example.

That means you have access to an incredible number of 3D models that can be loaded and referenced in ATON scenes. You can also mix them with local content or with other open formats supported by the framework.

Animations are fully preserved, as are materials, so the 3D model reacts consistently to different lighting conditions thanks to the PBR pipeline.

This integration is possible because the framework embraces open 3D standards like glTF: let’s talk about real interoperability!

When loading a Sketchfab asset in ATON – if one is not already set – you will be asked (once) for an API token, which you can find in your API settings:

ATON goes massive

There is amazing news regarding the ATON framework and large/massive 3D datasets.

The first is that ATON supports multi-resolution through an OGC standard: Cesium 3D Tiles. This has been under the hood for the last few years: check out the reference paper explaining this choice in more detail.

You can now load one or multiple tilesets in a 3D scene, without any restriction in terms of geometry or texture complexity. This is possible thanks to the integration of an open-source library developed under the NASA AMMOS project. Here is a sample multi-resolution dataset in an ATON scene:

The second is that this standard enables amazing integrations with other pipelines, services and tools. For instance, it is possible to load 3D tilesets hosted on Cesium ION by providing a token. Here is a sample of the entire city of Boston (a multi-resolution dataset), streamed from Cesium ION and explored through Hathor (the official ATON front-end) right in your browser, on any device:

This enables ATON 3D scenes to combine geospatial datasets available on the platform with standard glTF models or panoramic content from collections. More generally, integrations are possible with any service that provides datasets in this standard.

ATON has offered immersive VR presentation since 2016 (cardboards, 3-DoF and 6-DoF HMDs). The combination of WebXR + multi-resolution is fully supported: that means ATON allows you to explore large, massive 3D datasets using HMDs (e.g. an Oculus Quest) or cardboards without any installation (a dedicated paper on this is coming soon). Furthermore, it is possible to query all geometry, annotate, measure… at runtime! Check out a few examples in this video:

ATON was one of the first open-source solutions in the Cultural Heritage domain to combine WebXR and multi-resolution – back in September 2021. This investigation and the stress tests carried out on the NASA AMMOS library also led to a nice cross-contribution between the two open-source projects in 2021.

Real-time 3D collaboration on the Web

The new ATON 3.0 framework includes a completely renewed “VRoadcast” component that allows multiple remote users to collaborate in real time inside the same online 3D scene (no installation required).

The development of VRoadcast started 3 years ago, initially to experiment with collaborative features (basic chat messages) and later to visualize other users as basic avatars in 3D space. In this demo (2018), a quick communication test using VRoadcast was carried out involving different web browsers and… a running instance of Unreal Engine 4 with a centralized chat panel:

VRoadcast already offered incredible opportunities in the previous ATON 2.0, enabling remote users to interact in the same scene and even perform collaborative tasks right inside a standard browser. For instance, it was employed to study attentional synchrony in online, collaborative immersive VR environments (Chapter 6 of “Digital & Documentation, volume 2” – an open-access book). VRoadcast was also used for virtual classrooms during the pandemic, allowing remote students to collaboratively populate sample 3D scenes in online sessions:

The new VRoadcast in ATON 3.0 offers improved performance, scalability, out-of-the-box collaborative features targeting CH, and a simple API to create custom replicable events within web apps (mobile, desktop and immersive VR/AR). It is now fully integrated into the official ATON front-end, so remote users and the general public can already access it. More details are coming soon in upcoming demos and papers.

These features were also employed during the ArcheoFOSS workshop on ATON 3.0 to collaboratively populate a sample 3D forest and annotate a few 3D models together. Check out what ArcheoFOSS participants annotated together on this sample Venus statue!

Stay tuned: we are planning a few open-access collaborative events!
You can join our public ATON open group on Telegram for more information.

Developer perspective

If you are a developer, here’s a quick example of creating a completely custom event using the new API. In this case we fire a network event called “chatmessage”, carrying the data “hi”, to other users in the same scene:

ATON.VRoadcast.fireEvent("chatmessage", "hi");

To handle such events, we simply subscribe using:

ATON.VRoadcast.on("chatmessage", (m)=>{
   alert("Received message: " + m);
});

In this example, when one user fires the event, the other remote users handle it by showing an alert with the received “hi”. Note that the broadcast data can be an arbitrary object. Also have a look at the ATON 3.0 examples on GitHub, and start creating your own collaborative web app.
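Since the payload can be any serializable object, the same fire/subscribe pattern works for richer events such as shared 3D markers. Below is a minimal, self-contained sketch of that pattern: the event name “usermark” and its fields are hypothetical, and a tiny local event bus stands in for the network layer (in a real ATON web app you would call ATON.VRoadcast.fireEvent and ATON.VRoadcast.on instead, which replicate the event to remote users):

```javascript
// Tiny local event bus mimicking the fire/subscribe pattern.
// In a real ATON web app, "bus" would be ATON.VRoadcast and
// fireEvent would send the payload to remote users in the scene.
const bus = {
  handlers: {},
  on(evtname, handler) {
    (this.handlers[evtname] = this.handlers[evtname] || []).push(handler);
  },
  fireEvent(evtname, data) {
    (this.handlers[evtname] || []).forEach(h => h(data));
  }
};

// Subscribe: each user receiving "usermark" could place a marker in the scene
bus.on("usermark", (d) => {
  console.log(`Marker '${d.label}' at (${d.pos.x}, ${d.pos.y}, ${d.pos.z})`);
});

// Fire: the payload is an arbitrary object, not just a string
bus.fireEvent("usermark", {
  label: "fresco detail",
  pos: { x: 1.2, y: 0.5, z: -3.0 }
});
```

The event name and payload shape are entirely up to your web app: as long as every client subscribes to the same name, each one decides locally how to react to the incoming data.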