Read about our open source vision for 2022

Avatar infrastructure for the open metaverse and web3.

Easily pipe avatars into your app, game or website.

Alter is an open source, cross-platform SDK consisting of a real-time 3D avatar system and motion capture built from scratch for web3 interoperability and the open metaverse.


What you see is what you get.

The custom engine assembles avatars on the client side from individual 3D parts, e.g., head, hair, or wearables, using the Avatar Matrix. This approach allows composability across different apps and games within the Alter ecosystem. The engine then renders the assembled avatars consistently across all platforms.

Interoperability powered by blockchain.

The Avatar Matrix is an on-chain JSON configuration file used instead of complete 3D avatars, which would otherwise be downloaded as large, opaque binary blobs; this is what enables interoperability. People can hop around the metaverse with avatars stored in their wallets. Developers and creators can spawn new ecosystems based on NFT traits, e.g., the Loot derivatives.
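
As a rough illustration of the idea (and not the actual Avatar Matrix schema), the TypeScript sketch below shows a small, plain-JSON avatar configuration whose part identifiers a client could resolve to its own local 3D assets; every field name here is a hypothetical placeholder.

    // Hypothetical avatar configuration; field names are illustrative and do
    // not reflect the real Avatar Matrix schema. The point is that an avatar
    // is described by small, referenceable parts rather than one binary blob.
    interface AvatarConfig {
      version: string;
      parts: {
        head: string;          // identifier of a head asset
        hair?: string;         // identifier of a hair asset
        wearables?: string[];  // identifiers of clothing/accessory assets
      };
      colors?: Record<string, string>; // tint overrides, e.g. skin or hair
    }

    const config: AvatarConfig = {
      version: "1.0",
      parts: {
        head: "head_basic_01",
        hair: "hair_short_02",
        wearables: ["hoodie_03", "glasses_01"],
      },
      colors: { skin: "#e0b089", hair: "#3b2f2f" },
    };

    // Plain JSON like this is tiny, so it can live on-chain or travel with a
    // wallet; each client resolves the identifiers to its own 3D assets and
    // assembles the avatar locally.
    console.log(`config size: ${JSON.stringify(config).length} bytes`);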

  • A dramatically more powerful avatar system.
  • Animation ready.
  • Designed from scratch for the metaverse & web3.
  • Extraordinarily customizable.
  • Stunning and lively facial expressions driven by any camera or voice.
  • Cross platform.
  • Boundless materials and shaders.
  • A massive leap in avatar interoperability & composability.
  • Powerful and easy to use creator tools.
  • Own your own identity.
  • Shapeshifting 3D models.
  • NFT enabled marketplace of virtual goods.
  • Open source.
  • Get started...

    Join Discord to learn how to integrate our SDK into your app, game or website.

  • What is Alter Core?

    It's a cross-platform software development kit (SDK) consisting of a real-time 3D avatar animation system and facial motion capture for apps and games, built from scratch by Alter for web3 interoperability and the open metaverse.

  • How do I integrate Core into my app?

    Complete documentation, detailed instructions, and code samples will soon be live on GitHub. In the meantime, we can help you directly on our Discord.
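
    Until then, here is a rough, hypothetical sketch of the typical flow when wiring the SDK into a web app. The names AlterCore, createAvatarView, trackFromCamera, applyBlendshapes, and setHeadRotation are placeholders, not the actual Core API; only the overall steps (initialize the engine, attach it to a canvas, feed it camera frames, apply the tracked expressions) are meant literally.

        // Hypothetical integration sketch; the SDK names below are placeholders
        // and NOT the real Core API. See the official docs and Discord for the
        // actual integration steps.
        declare const AlterCore: any; // placeholder for the SDK entry point

        async function startAvatar(canvas: HTMLCanvasElement): Promise<void> {
          // 1. Initialize the engine with a canvas to render into.
          const core = await AlterCore.init({ canvas });

          // 2. Create an avatar from an Avatar Matrix-style config (illustrative).
          const avatar = await core.createAvatarView({
            parts: { head: "head_basic_01", hair: "hair_short_02" },
          });

          // 3. Ask for the user's webcam and feed frames into motion capture.
          const stream = await navigator.mediaDevices.getUserMedia({ video: true });
          core.trackFromCamera(stream, (result: any) => {
            // 4. Apply ARKit-compatible blendshape weights and head pose.
            avatar.applyBlendshapes(result.blendshapes);
            avatar.setHeadRotation(result.headRotation);
          });
        }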

  • What can I build with Core?

    Any app or game experience that uses an avatar as a profile picture or an animated character: mobile games, 3D worlds, next-gen social apps, VTubing, dating... the only limit is your imagination.

  • What is the license model?

    We're currently determining which open source license fits our project best—more details on that in the coming weeks.

  • Will Core slow my app down?

    We have optimized all features to have a minimal impact on your GPU/CPU. Still, you might notice some loss of performance, especially on lower-end Android or older devices, because we run computer vision algorithms, neural-net-based facial motion capture, and facial expression analysis on the device.

  • What is an Avatar Matrix?

    It's a plaintext configuration file in JSON format that our engine uses to compose an avatar in a 3D scene on the client side. An Avatar Matrix takes up very little space, so it can easily be transferred over the network or saved to the blockchain. That would be hard to do with large, opaque, binary 3D files hosted in the cloud.

  • How do I create custom items for avatars?

    Soon, anybody will be able to upload new content or remix existing content using our creator tools.

  • Can I use different avatar styles?

    Yes. You can use your own 3D models or models from creators (UGC), as long as the models include our rig so they are compatible with the Avatar Matrix. This way, you can also use the entire library of avatar items for customization made by Alter, creators, or brands.

  • Do I have to create items for different avatar styles?

    No. Avatar items are interoperable across different avatar styles as long as the items are compatible with the Avatar Matrix. It works like magic.

  • Do you provide actual 3D avatar files, e.g., VRM?

    Yes. We will provide APIs for that.

  • Can I take all my data with me?

    Yes. All data, if any, can be exported.

  • When Unity and Unreal?

    Unity and Unreal are big milestones on our roadmap, but we do not have a specific release date yet.

  • What is interoperability?

    In the avatar context, it's the ability to take an avatar from one digital experience to another and use it. Our system uses the Avatar Matrix and the custom, cross-platform rendering engine to achieve that.

  • Why did Alter build Core?

    Years ago, frustrated with the existing solutions on the market, we decided to build a custom solution from scratch for our specific needs and to solve other developers' frustrations as well. Some essential building blocks of web3, like interoperable avatars, are still inaccessible to many developers of next-gen social apps and games.

  • What is the open metaverse?

    It's a vision of a more immersive Internet built on decentralized and open standards, providing genuine digital property rights to its users. We believe the future of the Internet will be a combination of private and public experiences linked together by open infrastructure.

  • Who is Alter?

    Formerly known as Facemoji, we're a deep tech company focused on computer vision and avatars. We exist to alter reality for others through technology. Our mission is to retool how developers and creators produce goods for virtual worlds, from creation to distribution. While doing so, we want to have a positive impact on how you can showcase who you are to the world.

  • Does Core transmit data to Alter's or any other servers?

    No.

  • Does Core collect or store any personal data?

    No.

  • What if I have a feature request or find a bug?

    Feel free to join Discord and let us know.

  • Supported platforms
  • iOS
  • Android
  • WebGL 2
  • macOS (WIP)
  • Windows (WIP)
  • Unity (Soon)
  • Unreal (Soon)
  • Avatar formats
  • Head only
  • A bust with clothing
  • Accessories only (e.g., for AR filters)
  • Full body (Soon)
  • Variability
  • Human and non-human
  • From toddler to skeleton
  • All genders, including non-binary
  • Full range of diversity
  • Facial expression capture
  • 42 tracked facial expressions via blendshapes
  • Light and fast, with just a 3 MB ML model
  • ≤ ±50° pitch, ≤ ±40° yaw and ≤ ±30° roll tracking coverage
  • Tracking input
  • Any webcam
  • Photo
  • Video
  • Audio
  • Tracking output (see the sketch after this list)
  • ARKit-compatible blendshapes
  • Head position and scale in 2D and 3D
  • Head rotation in world coordinates
  • Eye tracking including eye gaze vector
  • 3D reprojection to the input photo/video
  • Tongue tracking
  • Performance
  • 50 FPS on Pixel 4
  • 60 FPS on iPhone SE (1st gen)
  • 90 FPS on iPhone X or newer
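
To make the tracking output listed above concrete, here is a hypothetical TypeScript shape of a single tracking result; the field names are illustrative, not the actual Core API.

    // Hypothetical shape of one tracking result, collecting the outputs listed
    // above. Field names are illustrative, not the actual Core API.
    interface TrackingResult {
      // 42 ARKit-compatible blendshape weights in the 0..1 range,
      // e.g. { jawOpen: 0.8, eyeBlinkLeft: 0.1, tongueOut: 0.0, ... }.
      blendshapes: Record<string, number>;
      // Head position and scale in normalized 2D image space and in 3D.
      headPosition2D: { x: number; y: number; scale: number };
      headPosition3D: { x: number; y: number; z: number };
      // Head rotation in world coordinates (degrees).
      headRotation: { pitch: number; yaw: number; roll: number };
      // Eye tracking, including a gaze direction vector.
      eyeGaze: { x: number; y: number; z: number };
    }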

Own a piece of the web3 future.

Learn more
Alter Avatars