Loading Sketchfab models in ATON!

It is now possible to directly load Sketchfab 3D models in ATON! Check out the official example.

That means you have access to an incredible number of 3D models that can be loaded and referenced in ATON scenes. You can also mix them with local content or other open formats supported by the framework.

Animations are fully preserved, as are materials, so the 3D model reacts consistently to different lighting conditions thanks to the PBR pipeline.

Such integration is possible because the framework embraces 3D standards like glTF: let’s talk about real interoperability!

When loading a Sketchfab asset in ATON, you will be asked (once, if not already set) for an API token, which you can find in your Sketchfab API settings.
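
For developers: once a Sketchfab asset is resolved (ATON handles the token and model ID as described above), it ends up in the scene graph like any other glTF content. Here is a minimal sketch, with a placeholder node name and glTF URL; double-check the exact calls against the official ATON examples:

// Minimal sketch: loading glTF content (e.g. resolved from Sketchfab) into an ATON scene node
// The URL and node name below are placeholders
let statue = ATON.createSceneNode("statue");
statue.load("https://example.org/assets/statue.gltf");
statue.attachToRoot(); // make the node part of the visible scene graph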

ATON goes massive

There is amazing news regarding the ATON framework and large/massive 3D datasets.

The first is that ATON supports multi-resolution content through an OGC standard: Cesium 3D Tiles. This has been in the works under the hood for the last few years: check out the reference paper explaining this choice in more detail.

You can now load one or more tilesets in a 3D scene, without any restriction in terms of geometry or texture complexity. This is possible thanks to the integration of an open-source library developed under the NASA AMMOS project. Here is a sample multi-resolution dataset in an ATON scene:
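
From a developer perspective, a tileset can be referenced much like any other content item. The following is only a sketch: it assumes the tileset is reachable through the same scene-node loading path used for glTF content and that the loader recognizes a tileset.json URL (both are assumptions; the URL is a placeholder, so check the official examples for the exact entry point):

// Sketch only: loading a Cesium 3D Tiles dataset into an ATON scene node
// Assumption: node.load() accepts a tileset.json URL (placeholder below)
let city = ATON.createSceneNode("city");
city.load("https://example.org/tilesets/city/tileset.json");
city.attachToRoot();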

The second is that this standard enables amazing integrations with other pipelines, services and tools. For instance, it is possible to load 3D tilesets hosted on Cesium ION by providing a token. Here is a sample of the entire city of Boston (a multi-resolution dataset), streamed from Cesium ION and explored through Hathor (the official ATON front-end) right in your browser, on any device:

This enables ATON 3D scenes to combine geospatial datasets available on the platform with standard glTF models or panoramic content from collections. More generally, integrations are possible with any service that provides datasets in this standard.

ATON has offered immersive VR presentation since 2016 (cardboards, 3-DoF and 6-DoF HMDs). The combination of WebXR and multi-resolution is fully supported: that means ATON allows you to explore large, massive 3D datasets using HMDs (e.g. an Oculus Quest) or cardboards without any installation (a dedicated paper on this is coming soon). Furthermore, it is possible to query all geometry, annotate, measure… at runtime! Check out a few examples in this video:

ATON was one of the first open-source solutions in the Cultural Heritage domain to combine WebXR and multi-resolution, back in September 2021. This investigation and the stress tests carried out on the NASA AMMOS library also led to a nice cross-contribution between the two open-source projects in 2021.

Real-time 3D collaboration on the Web

The new ATON 3.0 framework includes a completely renewed “VRoadcast” component, which allows multiple remote users to collaborate in real time inside the same online 3D scene (no installation required).

The development of VRoadcast started three years ago, initially to experiment with collaborative features (basic chat messages) and then to visualize other users as basic avatars in 3D space. In this demo (2018) a quick communication test using VRoadcast involved different web browsers and… a running instance of Unreal Engine 4 with a centralized chat panel:

VRoadcast already offered incredible opportunities in the previous ATON 2.0, enabling remote users to interact in the same scene and even perform collaborative tasks right inside a standard browser. For instance, it was employed to study attentional synchrony in online, collaborative immersive VR environments (Chapter 6 in “Digital & Documentation, volume 2”, an open-access book). VRoadcast was also used for virtual classrooms during the pandemic, allowing remote students to collaboratively populate sample 3D scenes in online sessions:

The new VRoadcast in ATON 3.0 offers improved performance, scalability, out-of-the-box collaborative features targeting CH, and a simple API to create custom replicated events within web apps (mobile, desktop and immersive VR/AR). It is now fully integrated into the official ATON front-end, so remote users and the general public can already access it. More details will follow in upcoming demos and papers.

These features were also employed during the ArcheoFOSS workshop on ATON 3.0 to collaboratively populate a sample 3D forest and annotate a few 3D models together. Check out what ArcheoFOSS participants annotated together on this sample Venus statue!

Stay tuned: we are planning a few open-access collaborative events!
You can join our public ATON group on Telegram for more information.

Developer perspective

If you are a developer, here is a quick example of creating a completely custom event using the new API. In this case we fire a network event called “chatmessage” containing the data “hi” to other users in the same scene:

ATON.VRoadcast.fireEvent("chatmessage", "hi");

To handle such events, we simply subscribe using:

ATON.VRoadcast.on("chatmessage", (m) => {
    alert("Received message: " + m);
});

In this example, when one user fires the event, other remote users handle it by showing an alert with the received “hi”. Note that the broadcast data can be an arbitrary object (see the sketch below). Have a look at the ATON 3.0 examples on GitHub, and start creating your own collaborative web app.
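
For instance, a minimal sketch of broadcasting a structured object (the event name and fields are purely illustrative):

// Fire a custom event carrying an object payload (illustrative name and fields)
ATON.VRoadcast.fireEvent("placemarker", { x: 1.0, y: 0.5, z: 2.0, label: "entrance" });

// Remote users in the same scene handle it
ATON.VRoadcast.on("placemarker", (d) => {
    console.log("Marker '" + d.label + "' at", d.x, d.y, d.z);
});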

A DPF WebApp prototype

A WebVR prototype based on the DPF framework has been deployed in order to test a selected dataset from the San Marino project. Several features were added to the DPF.js library, for instance better support for mobile device orientation, better semantic encoding, and built-in support for positional 6-DoF controllers such as Oculus Touch for locomotion within the DPF network, right inside the browser. Here is a quick video of a recorded online browser session:

You can test out a sample demo with semantic interrogation for mobile here, and a demo for desktop browsers with the DPF-Network here.

(Re)Building Forum Augusti

This is a first update on progress (July-August) on game-oriented asset production for the new Forum Augusti, employed to create reusable, flexible and modular 3D components. The FBX models mostly target use in Unreal Engine 4 and SONY PhyreEngine (PlayStation VR) within the REVEAL Project. The overall process started from basic props and components (e.g. columns, doors, braziers, etc.), most of them re-optimized to fulfil proper guidelines, including PBR requirements for VR use.

Most of the per-asset workflow focused on geometry optimization, levels of detail, UV maps and other aspects, taking into account a few considerations regarding VR best practices and in-game use. At this stage, the produced FBX files were imported into Unreal Engine in order to verify modularization, parametrization and procedural generation, as described in a previous post. This approach guarantees great flexibility for level designers, or in case different hypotheses are validated during the process.

The video shows a short overview of a few experiments right before the summer break, using the produced modular assets, their parametrization and a first draft of PBR-based materials. These tasks were also carried out in order to verify performance scaling at a macroscopic level. More technical updates to come… Stay tuned.

Building Procedural Instancing Tools for Unreal Engine

When dealing with repeated 3D elements, a meaningful approach is to procedurally describe and generate them through a template and a transformation list. The upcoming version of ATON 2.0, for instance (yes, pun intended), already provides built-in procedural methods to instantiate several 3D objects by streaming a template (e.g. a column) and a transformation list, thus greatly reducing bandwidth. On the desktop side, porting of these functionalities from OpenSceneGraph to Unreal Engine 4 (ver. 4.16.1) began a week ago.

A C++ ActorComponent has been developed to read common ASCII files representing a TransformList (position, rotation and scale) and a StaticMesh representing the Template. This functionality can be employed in a compact and very basic Blueprint node (see figure), offering a flexible and manageable tool for designers, while keeping all parsing and processing routines at the C++ level. A TransformList file can be updated externally (e.g. generated by another software package) and the procedural generation automatically rebuilds right inside the Unreal Editor.
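
Purely as an illustration of the idea (the exact layout expected by the component is not detailed in this post), a TransformList could be a plain ASCII file with one instance per line, listing position, rotation and scale:

# x y z | pitch yaw roll | sx sy sz   (hypothetical layout)
1200.0 350.0 0.0   0 0 0    1.0 1.0 1.0
1200.0 850.0 0.0   0 0 90   1.0 1.0 1.0
1750.0 350.0 0.0   0 0 0    1.0 1.0 1.0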


A second Instancer tool has been developed (this time fully in Blueprint) to offer a customizable way to generate randomized instances (again in the Editor) within a given radius, with draping, overlap options, scale variance and much more. The following shows an Instancer with trees (top left), another Instancer with boulders (bottom left), and a third Instancer (ferns) testing adaptive placement on top of world geometry (right).

Adding another Instancer for grass, and tweaking scale variance inside the Unreal Editor.

There are more upcoming features… Stay tuned!

VR in your browser

Over the last few months, several updates were rolled out for Aton, right before the summer vacation. The WebVR one allows users to toggle immersive VR viewing of 3D models and archaeological areas through HMDs like the Oculus Rift, HTC Vive and many others in WebVR-enabled browsers.

To get started with WebVR, you can download recent Chromium builds here and start exploring published 3D scenes through the Aton Front-End. One of the goals of this feature is also to realize the so-called WebVR responsive concept, where the service adapts to your viewing environment by using fluid layouts, similar to Aton's responsiveness for mobile devices (HTML5 UI, multi-touch controls, simplified shaders, etc.). Next steps will focus on proper travel handling in order to offer a smooth HMD experience.

The WebVR update for Aton is also coupled with another upcoming update for the desktop application (ovrWalker), targeting performance for complex, multi-resolution 3D scenes and 3DUI research applied to CH (watch the video here). VR features in Aton will be further enriched, along with massive upcoming updates related to node loading, so stay tuned!

Fictional 3D Landscapes: the Game of Thrones map experiment

This is an ongoing experiment aiming to create a fictional 3D map from the famous Game of Thrones TV show. Landscape Services were used to generate a multi-resolution base terrain combined with multi-layer imagery to exploit Aton's simplified PBR rendering.

Click on the following image to launch the interactive exploration (WARNING: the 3D map contains spoilers from previous seasons).

Click on this image to explore the interactive GoT fictional 3D map

Starting from a base imagery map, a DEM was created from scratch (isolating water, rivers, hills, etc.): in this case no geo-referencing (through world files) is involved, since the world scale and extent are… quite fictional and pretty vague. Separate scene-graphs were built to handle simplified trees (mimicking those shown in the opening credits), taking advantage of instancing capabilities, as well as a water level with its own multi-texturing, all composed into a global scene. This experiment also introduces the new annotation features offered by Aton, used (for now) to map major locations and/or events. The scene is still evolving: the digital elevation model needs some fine-tuning, and new annotations and 3D models of major cities/places will eventually be added to enrich the overall composition.

More info coming soon: this post will be updated!

A new PBR Model

Another major update for Aton is about to be deployed. A lot of work has been carried out to provide a modern, efficient, real-time PBR model. Much of the inspiration comes from Unreal Engine 4 (UE4 for short) and its advanced PBR system. The WebGL world of course faces several limitations that need to be addressed, sometimes in “smart” ways or using approximation techniques (special PRT and SH solutions and much more) to reduce the GPU workload.

A few samples using the new Aton PBR model.

Check out this demo, or this one.
The new, upcoming PBR system for Aton, combined with the RGBE model, now supports the following (see the sketch after the list):

  • Base map (diffuse or albedo)
  • Ambient occlusion map
  • Normal map
  • Roughness map
  • Metallic map
  • Emissive map
  • Fresnel map
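
Purely for illustration (this is not Aton's own API): in a generic Three.js-based WebGL setup, a comparable map set is wired up as shown below; the texture paths are placeholders, and the Fresnel map has no standard slot, so it would require a custom shader.

// Illustration only, not Aton's API; assumes Three.js is available globally as THREE
const loader = new THREE.TextureLoader();
const material = new THREE.MeshStandardMaterial({
    map:          loader.load("base.jpg"),      // base map (diffuse / albedo)
    aoMap:        loader.load("ao.jpg"),        // ambient occlusion (may need a second UV set)
    normalMap:    loader.load("normal.jpg"),
    roughnessMap: loader.load("roughness.jpg"),
    metalnessMap: loader.load("metallic.jpg"),
    emissiveMap:  loader.load("emissive.jpg"),
    emissive:     new THREE.Color(0xffffff)     // emissiveMap contributes only with a non-black emissive color
});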

The new Aton PBR with the new real-time rendering engine. Textures modified from http://hrp.duke4.net/

Close-up on variable roughness, reflections and indirect light contribution (bottom cube faces)

The new PBR maps workflow also aims to be as close as possible to a workflow involving UE4 (or other modern real-time PBR engines), so that such maps can be fully reused (e.g. via the “Roughness” and “Metallic” pins in a UE4 material blueprint). Nevertheless, the new model is also compatible with “basic” workflows, such as classic diffuse-only 3D modeling (or diffuse + separate AO, etc.). The screenshots below show a sample workflow in UE4 using the same identical PBR maps applied to cube datasets:

Sample workflow in a UE4 material BP using the same maps.

Of course, these improvements will also extend to multi-resolution datasets (e.g. ARIADNE Landscape Services) to produce aesthetically pleasing 3D landscapes by providing additional maps in the input section, while maintaining the efficiency of the underlying paged multi-resolution. Furthermore, the Aton PBR system is also VR-ready, providing realistic and consistent rendering of layered materials on HMDs as well, in WebVR-enabled browsers.

To import and ingest 3D assets in common 3D formats (.obj, .3ds, … and many more) into the Aton system and its PBR pipeline, the Atonizer service will soon be available.

Stay tuned!

Light Probing system is here

A massive new update for the WebGL Aton FrontEnd has been deployed.
The real-time lighting component, including the GLSL vertex and fragment shaders, has been completely rewritten to feature a full light-probing system with advanced per-fragment effects.

The FrontEnd is now able to manage ambient occlusion maps, normal maps, specular maps, fog and other information to correctly render different materials and lighting effects using a modern, per-fragment approach. Note that LPs also contain an emission map, to consistently define light sources and their shapes. Consistent rendering of normal maps required special attention regarding efficient computation with GLSL fragment derivatives (this will be explained in a separate post).

Chrysippus Head with LP: left without normal maps, right with normal maps. 3D Model created by E. Demetrescu

You can interactively change the lighting orientation by holding the ALT key and moving the mouse around. Try it live with:

Left with LP disabled, right with LP enabled. Click to open live demo and play with parameters

Light Probing applied to multi-resolution terrain datasets. 3D terrains generated by ARIADNE Landscape Services

These new features also apply to terrain datasets and, generally speaking, to all multi-resolution 3D assets published through Aton… Have a look at this landscape generated using the ARIADNE Landscape Service. Yes, that means new options will soon be added to the terrain service for processing multiple maps (e.g. specular maps for lakes, rivers, etc.) for 3D terrain datasets published online.

An Earth model with diffuse, normal and specular maps with unreal stars

More info and details will soon be published. Stay tuned.