Key Dates

Openmesh Expansion Program: Sep 01 - 26

Decentralized Cloud Initiative: Sep 20 - 30

Openmesh Node Sale: Oct 28 - 30


Openmesh Overview

Decentralization Compromised


A critical issue in Web3 is the growing dependence on centralized Web2 infrastructure to host blockchain nodes.

More than 80% of critical Web3 infrastructure—including validator nodes, dApps, frontends, databases, and APIs—relies on centralized cloud platforms like AWS and Google Cloud. Ethereum's transition from Proof of Work (PoW) to Proof of Stake (PoS) exacerbated this centralization, with more than 70% of Ethereum nodes now relying on these same centralized cloud providers.


This reliance undermines decentralization both operationally and technically, introducing avoidable vulnerabilities. If governments or regulatory authorities seek to target and terminate any Web3 service, they can direct cloud service providers, such as AWS or Google, to cease providing their services to these nodes. Similarly, a simple change in a cloud provider's Terms of Service could disrupt or even block the ability of nodes to operate.

The impact on blockchains and the dApps they support is substantial, calling for a solution that can restore true decentralization and resilience in Web3 infrastructure.



Openmesh's Solution


Openmesh is building a fully decentralized cloud, data, and oracle network that eliminates the need for middlemen and centralized servers. By providing decentralized, permissionless cloud infrastructure, Openmesh ensures that blockchain nodes, Web3, and even Web2 applications can operate seamlessly without the threat of regulatory overreach or centralized control. With Openmesh, trust is embedded in the system itself, validated by a decentralized network of independent, anonymous validators, ensuring immutability and security by design.



Principles and Governance


Openmesh is built on the core principles of Web3: decentralization, transparency, immutability, and security, ensuring sovereignty for all users. These values are deeply embedded in Openmesh's design and architecture, making it a truly decentralized solution.

Openmesh operates under a Decentralized Autonomous Organization (DAO), not as a private company. This governance model allows for community-driven decision-making and ensures that no single entity has control over the network. Since its inception in late 2020, Openmesh has invested approximately $8.78 million in research and development, entirely bootstrapped by its early founders without any external funds, venture capital, private sales, or ICOs. This self-sustained approach demonstrates our commitment to building a resilient and independent network. The DAO structure not only aligns with our decentralization goals but also fosters a collaborative environment where all stakeholders have a voice in shaping the future of Openmesh.


Openmesh's 10 Commandments


  • Decentralized Infrastructure as a Foundation. No single authority can alter the data, its services & the network

  • Censorship Resistance by Design. Peer-to-peer structure, end-to-end encryption, immutable records, distributed authority

  • Accessibility for All (Democracy and Equality). Accessible to everyone, anywhere, without KYC, licensing, or payments, regardless of location or background

  • Composability & Interoperability by Design. Compatibility with various networks, devices and systems

  • Transparency by Design. All operations, and financials, including salaries and infrastructure costs, are public.

  • Local Interconnectivity. Direct communication between neighbouring nodes, optimizing data routing & file sharing

  • Redundancy and Reliability in Network Design. Multiple data pathways and nodes to ensure continuous data accessibility

  • Self-Healing Systems & Sustainability. Automatically adjusting and recovering from changes

  • Continuous Improvements for Scalability. Accommodate growth and diverse data types. Economic and Incentive models

  • Community Governance. Governed by its community of users and developers


Why Openmesh?

The internet has brought both unprecedented connectivity and a new culture to our world. Someone in the 90s could not have imagined that streaming a video from their bedroom could generate more income than a doctor earns. Fast forward to today: some YouTubers have more influence than politicians.

In today's hyper-connected world, if you get beaten by a police officer on the street and someone records and streams it, it's no longer just a local police department disciplinary issue; it can lead to a diplomatic crisis between nations.

The internet has changed the world for the better, for the most part. But it also comes with a price. Even though big Web 2.0 players may have ethical and moral codes, they still control who sees what, when, and why. Companies like Facebook and Google are among the world's most powerful because, within the last three years, data has surpassed oil in value.


“The world’s most valuable resource is no longer oil, but data.” (economist.com)

These big companies have authority and influence over the world’s data and information, and we are expected to trust them. Their advertising-based business models, built on personalized ads, represent over 60% of internet-generated revenue. Their “proprietary” algorithms are specifically designed to increase engagement regardless of the impact on their users.


Data is centralized.

Today, large data players like Amazon, Google, Facebook, and Microsoft control over 85% of Internet data. That is more than 100 zettabytes and growing exponentially.

Selling our attention, predictions of our future behavior, our actions, and our perception to advertisers for profit is the bottom line. We are the product and the puppets of algorithms.

Information asymmetry has fueled social inequality, scandals, polarization, and corruption, and has sometimes even led to war. With an enormous amount of concentrated data and information, surveillance capitalism has come to shape politics and culture. Sadly, the more power you have, the more information you can gain, creating an advantage for only a few individuals, entities, and societies. Sure, some say that’s how the world works. Maybe. But we have come a long way from kings and pharaohs. Today, thanks to the internet, most people have access to information that in the 60s was available only to presidents or VIPs.


The next generation of the web

Web 3.0 is here.

It is early, and Web 3.0 is estimated to be more complex and impactful than Web 2.0. The current web is centered around the “sharing of information”; Web 3.0 extends this to the sharing of information plus value. It will introduce more advanced industries than the current Web 2.0. Imagine the futuristic business applications we have yet to discover.

Building a petabyte-scale decentralized, immutable data infrastructure.

The next generation of the web (Web 3.0) is far more complicated and powerful. We should not underestimate the power and the core principles of decentralization. Distributed ledger technologies (DLT), smart contracts, scalable public blockchains, and movements toward decentralization will play a crucial role in the next 10-20 years of the internet.

With the use of cryptographic principles, distributed ledger technologies, and modern decentralized data architecture (data mesh), we have an opportunity and a small window of time to create a New Era of data infrastructure standards for Web 3.0.

Instead of putting all trust and faith in corporations and the people behind running them, we should embrace Open Data Protocols & Infrastructures with built-in trust, security, transparency, and traceability by design. The data should be immutable, and governance of the Protocol, management, and development execution of the roadmap should not have a single authority.

Building Open Data Protocols and Infrastructures will not only open doors for new industries and exponential growth in web3, but it will also help legitimize the industry. I believe access to quality, unedited information is a fundamental right, and you should NOT have to pay for it.


Why should we care?

Technologies no longer take a decade to double. In the Web 3.0 world, some technology adoption cycles are completed in less than 18 months.

If the semantic web is inevitable and far more complex than the current web, we all have a responsibility to pay attention to how Web 3.0 data will be governed and used. We should do our best to introduce appropriate standards, protocol design, and principles to establish a good foundation for the future of web3 data.

In the Web 2.0 era, innovators had amazing opportunities to invent “what is a database” and “how a web page should rank.” I believe we all have the same opportunity and responsibility today to invent, define, question, and build better technologies.

Access to information has improved societies for centuries. We all should acknowledge and do better.

“Access to truthful, unedited information is a fundamental right.”



Introduction to Openmesh

Openmesh Network is a four-year-old open-source DePIN project, with the majority of our 26 team members based in Sydney. Our team brings diverse experience from prominent Web3 projects and companies, including Fantom, Ripple, Ethereum, Cosmos, AWS, and Aragon. Openmesh is a P2P immutable data and decentralized cloud network. It combines the best features of AWS, IPFS, Chainlink, BitTorrent, and Oracle Cloud into a single solution. Openmesh provides immutable data via OpenAPIs and offers Web2/Web3 Infrastructure as a Service through Xnode. Xnode manages Openmesh network resources, including computational power and storage, across various deployment environments. Each node operates as a microservice, enhancing the overall efficiency and integrity of the Openmesh network.

Collectively, nodes form a network of data collectors, aggregators, and validators that facilitate a global system of data verification and dissemination. Xnode Studio is an infrastructure design, development, and management platform where users can assemble Web2 and Web3 infrastructure like Lego blocks. It enables developers, start-ups, and enterprise users to quickly build Web2 and Web3 infrastructure such as validator nodes, data clouds, data APIs, P2P compute and storage layers, analytics engines, and hosting services in minutes rather than weeks. Users can compose complex infrastructure leveraging interconnected worldwide data centers and fiber connectivity.

Xnode Studio features a powerful resource aggregation engine that scans through thousands of combinations of bare metal and cloud providers across various cities, regions, and countries to find the best resources such as compute, storage, and GPUs for your infrastructure needs, similar to how Skyscanner scans for airlines. 

All the tools and stacks are open source and available for anyone to build, innovate, and connect. The project's roadmap, funding, and governance are public and transparent. We see a world where data is out in the open, decentralized, and benefits everyone, not just a chosen few. This isn't just about technology; it's about shaping the next 10-20 years of the internet, preserving democracy, and ensuring a free flow of information for generations to come.



How Openmesh does it

The philosophy is the fundamental driver of the Openmesh solution’s design blueprint. 





Openmesh applies four core design principles to any large-scale network that serves large-scale societies and systems: the world should have open data and open IT infrastructure that no single person, corporation, or government owns.

1. No middleman

2. Immutability by design

3. Decentralized architecture and self-sustaining

4. Open source and owned by the community


We need to treat IT infrastructure as a public asset, owned by the public and managed by the people. Openmesh envisions a world where trust is built into the design, not into the brands or people running it. Instead of relying on a small group of people, we should have technologies that are immutable by design and verified by widely distributed independent verifiers.


Today’s world is more connected, and we have access to more resources and knowledge than ever before, so everyone has greater opportunity. Web2 allowed us to share information; Web3 permits the exchange of information and value, further promising ownership of data beyond read and write capabilities alone. It’s a chance to own, together, the opportunity the information age brings humanity.


In Web3, success is driven through collaboration, not competition.




How Openmesh works

Openmesh Protocol, by design, has no single authority that can alter its data, how the information is distributed or controlled, or who can see what and when. The infrastructure is open-source and governed by an open-source community.

Unlike traditional monolithic data infrastructures that handle the consumption, storage, transformation, and output of data in one central data lake controlled by a small group of people, Openmesh's self-serve design and data mesh architecture support distributed, domain-specific data consumers and view “data-as-a-product,” with each domain handling its own data pipelines.

When a user provisions a node, the process installs the core systems, low-level communications, and applications required to collect and process data onto the compute node, and then adds the compute node to the cluster. The self-serve design reduces technical complexity, letting users focus on their individual data use cases. This architecture gives end users and data owners greater autonomy and flexibility, facilitating greater data experimentation and innovation.


Core Technologies of Openmesh

  • Openmesh Cloud: Decentralized Cloud + Data + Oracle Network
  • OpenmeshPoS: Consensus mechanism in the Openmesh Network
  • Openmesh Consensus: The CometBFT engine in Openmesh Core that synchronizes nodes listening to the same data: a node proposes message chunks as a transaction, and the other nodes verify each chunk.
  • Universal Data Collector (UDC): A core component of Openmesh that collects, enhances, and standardizes data from various sources, including centralized exchanges and blockchain nodes.
  • Openmesh Resource Aggregation Layer (ORAL): Pools and manages computational power, storage, and other network resources across Xnodes, enabling flexible, scalable, and market-driven resource provisioning.
  • XnodeOS: Decentralized operating system designed to manage and optimize the deployment of Xnodes within the Openmesh Network. It ensures efficient resource management, security, and interoperability for running decentralized applications and services, all while supporting the seamless integration of Web2 and Web3 technologies.


Core Innovation & Breakthroughs: 

  • Decentralized & Immutable Cloud as a Service.

  • Xnode Studio: enables anyone to rent out their computer's unused storage and bandwidth for hosting files for someone anywhere in the world. An Airbnb for cloud and infrastructure.

  • Resource aggregation for digital commodities: like Skyscanner for CPUs, GPUs, storage, and bandwidth; an aggregation tool that searches and connects with public clouds, private and public data centers, and independent resource providers.

  • Rapid Infrastructure-deployment-as-a-service (RIDaaS). A transformative advancement in Web2 & Web3 infrastructure design and deployment.

  • Access 100,000+ Web2 & Web3 Apps

  • XnodeOS: Fast, Lightweight, Runs on Raspberry Pi

  • Multi-purpose Infrastructure as a Service (MPIaaS)

  • World's First Decentralized Virtual Machine (DVM)

  • Node Operator Economic Viability.


Openmesh journey

We have been in the blockchain and crypto space since 2015. For years, we had to spend tens of thousands of dollars to obtain reliable data from centralized data providers. We struggled with data quality, availability, and fair pricing. Often we waited weeks for the data providers' internal support tickets to be resolved. Some common data feeds cost us a few thousand dollars per month. We couldn't quite figure out why the data was so expensive. Further, when we asked these centralized data providers how accurate their data was, we often heard, “we are professionals; we have the best tech, and all our data is accurate.” There wasn’t any way for us to know if the data we paid for was accurate or not. We had no choice but to accept it and hope the centralized data companies were telling the truth.


However, time and time again, we faced numerous issues with data quality and cost. At the end of 2020, our frustration reached its limit, and we decided to build a data lake for ourselves. We thought it would take only a few months to build. Months turned into more than a year. Through many iterations, trial and error, and sleepless nights, we figured out the core design and operation principles needed to build a data infrastructure that’s efficient and scalable.


Go to market milestones:

[ Q1/Q2 2024 ]

- Integrate Xnode with Equinix, Hivelocity, Google, Vultr, ValidationCloud, Snowflake, Aiven

- Xnode Studio V.4.0 Beta

- Decentralized Cloud Initiative 2024

- Launched World's First Decentralized Virtual Machine (DVM)


[ Q3/Q4 2024 ]

- Openmesh is giving away USD100m worth of cloud credit to Web3 startups, protocols, and DAOs to jump-start the DePIN movement.

- Artificial Superintelligence Alliance RFP - Large-scale AI-optimized data center in collaboration with Nvidia, The Xnode MDC (Modular Data Center).

- Web3 Node Adoption Initiative with Chainlink, Fantom, Polygon, Avalanche, and 6+ more chains.

- Openmesh Decentralized & Immutable Cloud: Up to 80% cheaper than incumbent Web2 providers

- Xnode integration with Solana

- Xnode Studio v6 launch with 10,000+ apps

- Partnership with Hackernoon to host a DePIN educational content hackathon.

- Openmesh Node Sale.

- Validator Nodes Rewards Beta

- Openmesh Town Hall Event (Nvidia, Snowflake, Polygon, Databricks, MongoDB, Aiven, Hivelocity, Google, and more)

- CCIP <> Xnode Launch: connecting on-chain to Web2 infrastructure for devs, Web3 startups, protocols & DAOs + $5m Openmesh Cloud Resources announcement.

- Xnode Node Hardware Launch.

- OpenD/I 2024 Conference in New York by Openmesh.

- OpenD/I Developer Conference 2024 in Bangalore by Openmesh.

- OpenAPI v6 Launch


Importance and urgency of defining how Web 3.0 data should look and be used

Why is addressing data asymmetry important, and why should we act now?


We underestimate the power of data. We believe data is one of the most valuable commodities in the world. Defining how Web 3.0 data should be stored, shared, and distributed is very important. We are in a major transition period, going from Web 2.0 to Web 3.0.

Monopolistic Web 2.0 companies like Google, Yahoo, Oracle, and Microsoft took the opportunity to define what a database, a computer, a search engine, and a mobile application should look like. Web 2.0 innovations opened many doors, but they also made us the product by collecting our data and using it for their bottom line. Data Sovereignty and access to immutable data are important to our future generations.

Today we have the same opportunity & responsibility as in the late 90s to define what, how, and why Web 3.0 data should be stored, shared, and distributed.


At Openmesh, we chose to use code, cryptographic methods, and decentralized open architecture design to validate and introduce “how Web 3.0 data should be stored, shared, and distributed,” not relying on people, corporations, or even governments.

We are living in a remarkable time, and democratizing data and addressing information asymmetry at scale as the transition happens will benefit billions of people.



Self-sustaining & self-evolving open protocols without the intervention of corporations and governments are on the rise.

True believers in the Ethereum ecosystem contribute to and support its R&D. A community-proposed roadmap is executed by thousands of developers, running 24/7. The market capitalization of Ethereum is higher than that of Starbucks, Boeing, Toyota, Intel Corporation, Citigroup, PayPal, and American Express.

We have never seen an open protocol become this influential and impactful. This is a new era in which open technologies lead the industry, not monopolistic tech built by corporations. Self-sustaining, self-evolving open protocols are the future.


The Hype Cycle

Ethereum was invented over nine years ago. Having been in this space for over seven years, we have learned about practicality, viability, and hype. Like all technologies undergoing transformation, Web 3.0 industries have been through hype cycles. The next cycle will bring true adoption.


Decentralized applications are in the “slope of enlightenment.” We will reach the plateau of productivity in the next 2 to 3 years. In the 2000s, when the dot-com bubble burst, everyone was extremely negative and pessimistic about web technologies and the internet industry (Web 2.0) in general. No wonder the media and public perception of crypto and Web 3.0 is extremely negative today. We are going through a major technological shift.


Unlike in the early 2000s, the world is more connected than ever. We have more tech-savvy consumers who understand the value of their privacy and their data.


For those who missed seeing the growth of the Web 2.0 internet industry, we all are privileged to witness the next generation of the internet—Web 3.0.


At Openmesh, we aim to be the decentralized open data infrastructure powered by blockchain technology.



A market for real users in Web 3.0 is emerging.

Once the hype is over, people will see the value and power of open-source, decentralized technologies: the value of open data applications, data protocol utilities, and their applications. We expect a big transformation in open data and decentralized cloud infrastructure in the next two to three years.

Use Cases

What is Xnode?

Build a decentralized data cloud with data connectivity, configure data endpoints, and build infrastructure observability and analytics engines in minutes instead of weeks. Pay only for bare-metal servers. No licenses, no setup fees.

Xnode is designed to empower developers with the tools to construct their own decentralized data clouds from the ground up. But it's not just about individual power: Xnodes interconnect, forming a vast mesh network, reminiscent of torrent systems, ensuring seamless global data sharing. This decentralized approach guarantees data availability, reliability, and resiliency. Furthermore, Xnode stands as a testament to Openmesh's commitment to democratizing data, aligning with the vision of a world where data is universally accessible without middlemen or hefty fees. Whether you're building a data-intensive application or seeking to harness the power of Web3, Xnode provides the foundation you need to thrive in the decentralized era.


Use cases

  1. Decentralized dApp Development Platform
  • Use Case for Web3 Startups: Startups can utilize Xnode to rapidly prototype, test, and deploy decentralized applications without needing to set up their infrastructure. With Xnode, they can access a plethora of Web3 data to power their dApps efficiently.
  • Benefit: Speeds up the dApp development process, reduces infrastructure costs, and ensures startups remain agile in the competitive Web3 landscape.


  2. Real-time Price Optimization:
  • Use Case: DEX aggregators can leverage Xnode to fetch real-time prices from multiple decentralized exchanges, ensuring their users always get the best rates when swapping tokens.
  • Benefit: Enhances user trust by offering the most competitive rates and ensures maximum capital efficiency for trades.


  3. Real-time Financial Analysis Engine
  • Use Case for Hedge Funds: Hedge funds can deploy Xnodes to tap into real-time cryptocurrency market data, DeFi transactions, and other blockchain activities. By combining live data with historical insights, they can make more informed investment decisions.
  • Benefit: Provides a competitive edge in the dynamic crypto market, ensuring timely investment strategies based on comprehensive data analysis.


  4. Web3 Data Analytics Suite
  • Use Case for Data Science Firms: Such firms can utilize Xnode to access and analyze vast amounts of Web3 data. They can build custom analytics solutions for clients, from predicting DeFi market trends to analyzing blockchain game user behaviors.
  • Benefit: Enables data science firms to offer specialized Web3 analytics services, catering to the growing demand in the blockchain industry.


  5. Decentralized Compliance and Audit System
  • Use Case for Regulators/Law Firms: Regulatory bodies and law firms can set up Xnodes to monitor blockchain transactions, smart contract events, and DeFi activities. This ensures compliance with local regulations and helps in forensic analysis in case of legal disputes.
  • Benefit: Streamlines the regulatory oversight process, ensures transparency in the Web3 space, and aids in quick resolution of legal matters.


  6. Live Asset Valuation Dashboard
  • Use Case: Game developers can use Xnode to create live dashboards displaying the current market value of in-game assets, helping players make informed trading decisions.
  • Benefit: Enhances player engagement by providing real-time financial insights within the game.

  7. Historical Player Behavior Analysis
  • Use Case: GameFi platforms can utilize Xnode to study historical player behaviors, like trading patterns and gameplay strategies, to refine in-game mechanics and tokenomics.
  • Benefit: Offers a data-driven approach to game development, ensuring player satisfaction and retention.

  8. Decentralized Research Platform
  • Use Case: Universities and research institutions can deploy Xnodes to create a decentralized research data cloud, where scholars globally can contribute to and access datasets without centralized gatekeepers.
  • Benefit: This promotes collaborative research, ensures data integrity, and accelerates scientific advancements.


  9. Global Health Data Monitoring
  • Use Case: Health organizations can use Xnode to develop a decentralized health monitoring system. Real-time health data from hospitals worldwide can be streamed into the network, assisting in rapid response to health crises like pandemics.
  • Benefit: Ensures timely and transparent access to crucial health data, leading to faster interventions and better global health outcomes.



What are Unified APIs?

Unified APIs provide a single endpoint for all crypto & web3 data, making it accessible to anyone, anywhere, without any costs. There are no licenses, registrations, or setup fees involved.


Unified API Sub-Products:

  • WebSocket: A streaming service offering live data on markets & blockchain.
  • REST APIs: An archive for Web3 data, serving as historical data-as-a-service.
  • GraphQL: Allows for the construction of schemas and enables users to select precisely the information or operations they require.
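For illustration, here is a minimal Go sketch of consuming the WebSocket stream; the endpoint URL and message format are assumptions, as the Unified API's actual addresses and schemas are not listed here:

package main

import (
    "fmt"
    "log"

    "github.com/gorilla/websocket"
)

func main() {
    // Hypothetical stream endpoint, for illustration only.
    url := "wss://ws.openmesh.example/v1/trades?symbol=ETH-USDT"

    conn, _, err := websocket.DefaultDialer.Dial(url, nil)
    if err != nil {
        log.Fatalf("dial: %v", err)
    }
    defer conn.Close()

    // Print live market events as they arrive.
    for {
        _, msg, err := conn.ReadMessage()
        if err != nil {
            log.Fatalf("read: %v", err)
        }
        fmt.Println(string(msg))
    }
}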


Data Available in Our Databases:

  • Crypto Exchanges: Data includes order books, trade details, transactional data, candlestick charts, tickers, as well as inflow and outflow information.
  • Public Blockchains: Information on block size, block events, asset transfers, wallet addresses, blockchain confirmations, and smart contract events is available.
  • Decentralized Finance (DeFi): Comprehensive data covering all financial transactions, including wallets, assets, liquidity, value locked, asset transfers, and asset custody.
  • Metaverses: Contains information on metaverse assets, transactions, and user activities.
  • GameFi (Blockchain Games): Data on games, user assets, user activities, game asset classifications, asset transfers, and in-game transactions is available.


Applications of Unified APIs:

  • Developers: Can leverage Unified APIs to create and empower a multitude of web applications, mobile apps, dApps, and smart contracts.
  • Data Scientists: Can integrate Unified APIs into their data science workflows to facilitate analytics.
  • Financial Forensic Officers: Can amalgamate data to build cases on money laundering, especially when ERC20 assets are transferred between wallets, a process akin to what Chainalysis performs.
  • Game Developers: Can utilize the GraphQL query service to craft a Play-to-Earn analytics app, highlighting top-trading game assets on Ethereum.
  • Start-ups: Those in the process of crafting a DEX aggregator can harness the power of live streaming data (via WebSocket) and GraphQL to optimize the smart order route for token swaps.



Use cases

WebSocket - Live markets & blockchain data (Streaming service)

  1. Real-time Crypto Trading Platform
  • Ideal Users: Cryptocurrency Traders.
  • End Customer: Individual and institutional investors.
  • Application: A real-time trading platform showcasing live crypto prices, order books, and transactions.
  • Difference with Unified API: Traders receive instantaneous data, allowing them to make informed trading decisions. Given that the API is free, the platform can offer premium features without passing on data acquisition costs to users.
  • Impact: Enables traders to make timely and informed decisions, potentially leading to better investment outcomes.

  2. Live DEX Price Tracker
  • Ideal Users: DeFi enthusiasts and investors.
  • End Customer: Users of decentralized exchanges.
  • Application: A platform that showcases live price differences across various decentralized exchanges.
  • Difference with Unified API: Instantaneous price updates from different DEXs, helping users find the best exchange rates.
  • Impact: Users can maximize their returns by swapping tokens at the most favorable rates.

  3. Real-time GameFi Dashboard
  • Ideal Users: Gamers and investors in GameFi.
  • End Customer: Players of blockchain games.
  • Application: A dashboard displaying live in-game asset prices, transactions, and activities.
  • Difference with Unified API: Gamers get real-time insights into asset values, helping them make in-game financial decisions.
  • Impact: Players can optimize their in-game strategies, potentially earning more from their gaming activities.


  4. Live Metaverse Activity Monitor
  • Ideal Users: Metaverse enthusiasts.
  • End Customer: Users and investors in metaverse platforms.
  • Application: A platform showcasing live metaverse activities, transactions, and asset movements.
  • Difference with Unified API: Users receive real-time updates on metaverse happenings, ensuring they stay informed.
  • Impact: Enables users to engage with metaverse events and assets more dynamically.

  5. DeFi Dashboard for Liquidity Providers
  • Ideal Users: DeFi liquidity providers.
  • End Customer: Participants in liquidity pools.
  • Application: A dashboard displaying live data on liquidity, asset transfers, and value locked.
  • Difference with Unified API: Liquidity providers can monitor their positions and yields in real-time, allowing for timely adjustments.
  • Impact: Providers can optimize their strategies, potentially maximizing their returns.



REST APIs - Web3 Data archive (Historical data-as-a-service)

  1. Historical Crypto Analysis Tool
  • Ideal Users: Crypto analysts and researchers.
  • End Customer: Individual and institutional investors.
  • Application: A platform offering historical crypto data analytics, like price trends over time.
  • Difference with Unified API: Analysts can access vast historical data without the burden of fees, leading to comprehensive analysis.
  • Impact: Enables deeper insights, fostering informed investment strategies.

  2. DeFi Historical Performance Tracker
  • Ideal Users: DeFi project teams and investors.
  • End Customer: DeFi project stakeholders.
  • Application: A tool analyzing the historical performance of DeFi projects, liquidity, asset transfers, and more.
  • Difference with Unified API: Project teams can evaluate their growth and trends without incurring data access costs.
  • Impact: Projects can refine their strategies based on past data, potentially leading to better future performance.


  3. Blockchain Game Performance Analytics
  • Ideal Users: Game developers and GameFi investors.
  • End Customer: Blockchain game stakeholders.
  • Application: A platform analyzing the historical performance of blockchain games - user engagement, asset trading, etc.
  • Difference with Unified API: Game developers can assess their game's history without added costs, fostering improvements.
  • Impact: Developers can refine game mechanics based on past user behavior, potentially enhancing user engagement.

  4. Metaverse Land Value Analysis
  • Ideal Users: Metaverse landowners and investors.
  • End Customer: Metaverse participants.
  • Application: A tool offering insights into the historical value of metaverse land parcels.
  • Difference with Unified API: Landowners can assess the historical value trends of their assets without incurring data costs.
  • Impact: Helps stakeholders make informed buying or selling decisions.

  5. Historical Blockchain Audit Tool
  • Ideal Users: Blockchain auditors.
  • End Customer: Blockchain projects and investors.
  • Application: A platform for auditing historical data on public blockchains, verifying transactions, smart contract events, etc.
  • Difference with Unified API: Auditors can conduct comprehensive audits without data access fees, ensuring thoroughness.
  • Impact: Enhances the integrity and trustworthiness of blockchain projects through meticulous auditing.



GraphQL - Constructing schema & enabling data selection

  1. Customized DeFi Analytics Dashboard
  • Ideal Users: DeFi analysts.
  • End Customer: DeFi enthusiasts and investors.
  • Application: A dashboard where analysts can pick specific DeFi data points they want to monitor.
  • Difference with Unified API: Analysts can tailor their data feeds, selecting only what's relevant, leading to efficient monitoring.
  • Impact: Allows for more personalized and streamlined DeFi analysis.

  2. GameFi Asset Monitoring Tool
  • Ideal Users: GameFi asset traders.
  • End Customer: Players and investors in blockchain games.
  • Application: A platform where users can select specific games


What is Pythia?

Pythia is Openmesh's dynamic analytics and query engine, designed to transform a vast sea of raw data into meaningful, actionable insights. Pythia Analytics allows anyone to design, build, visualize, and deploy powerful Web3 data products. Users can craft data products (queries) and merge them to create dashboards tailored for specific industries or business use cases. Charts and their underlying data can be downloaded for free. Users can assign their queries, data products, and dashboards to their web3 wallet. They have the option to keep their data products private or publish them for the public, allowing other users to replicate and utilize them. Whether a seasoned developer or a newbie, Pythia enables the realization of data product visions.


Use cases

  • Crypto Market Trends Dashboard A trader crafts a dashboard in Pythia to monitor real-time cryptocurrency price trends, volume spikes, and liquidity pools, aiding in swift investment decisions.
  • Global DeFi Interest Rate Monitor Use Pythia to create a real-time dashboard that tracks and compares interest rates across various DeFi platforms, helping investors maximize their returns.
  • Celebrity NFT Tracker Track and visualize the hottest-selling celebrity NFTs. Artists and managers can identify which types of NFTs are trending and strategize their releases accordingly.
  • DeFi Performance Analysis DeFi enthusiasts utilize Pythia to analyze the performance of various DeFi platforms, assessing aspects like total value locked, liquidity provider returns, and smart contract activities.
  • Blockchain Game Metrics Game developers leverage Pythia to track user engagement, in-game asset trading patterns, and player achievements, refining gameplay mechanics to enhance user experience.
  • Metaverse Real Estate Insights Investors construct dashboards to monitor the trading and popularity trends of virtual real estate parcels across different metaverses, identifying potential investment opportunities.
  • Public Blockchain Health Monitor Network analysts use Pythia to keep tabs on public blockchain activities, such as block sizes, transaction speeds, and wallet activities, ensuring the network's optimal performance.

Vision & Mission

The Openmesh Vision

Openmesh began as a bold vision to democratize access to data and infrastructure in a world increasingly dominated by a few centralized entities. Founded by a team of experts from leading Web3 projects and technology giants, Openmesh has grown into a pioneering force in the decentralized technology space.

Our vision is to build the next generation of open infrastructure that will be pivotal in shaping the future of permissionless internet, preserving democracy, and ensuring a free flow of information.


The Openmesh Mission

Openmesh's mission is to bridge the gap between Web2 & Web3 and accelerate the Web3 industry.

Core Principles 

Openmesh's 10 Commandments


  • Decentralized Infrastructure as a Foundation. No single authority can alter the data, its services & the network
  • Censorship Resistance by Design. Peer-to-peer structure, end-to-end encryption, immutable records, distributed authority
  • Accessibility for All (Democracy and Equality). Accessible to everyone, anywhere
  • Composability & Interoperability by Design. Compatibility with various networks, devices and systems
  • Transparency by Design. All operations, and financials, including salaries and infrastructure costs, are public.
  • Local Interconnectivity. Direct communication between neighbouring nodes, optimizing data routing & file sharing
  • Redundancy and Reliability in Network Design. Multiple data pathways and nodes to ensure continuous data accessibility
  • Self-Healing Systems & Sustainability. Automatically adjusting and recovering from changes
  • Continuous Improvements for Scalability. Accommodate growth and diverse data types. Economic and Incentive models
  • Community Governance. Governed by its community of users and developers


Transparency

  • Open-source: All the tech we build is open-source, including applications used for data collection, processing, streaming services, core cloud infrastructure, back-end processes, data normalization, enhancement framework, and indexing algorithms.
  • Open Accounting: All the R&D costs and infrastructures, including salaries we pay, are public. We have a real-time display of all the running costs.
  • Open R&D: All the project roadmap and real-time R&D progress are publicly available. Anyone can participate or critique.
  • Open Infra: All our infrastructure monitoring tools, core usage, and analytics are public. The server usage, real-time data ingestion, computation, and service logs are all public.
  • Open Verification: End-to-end data encryption: all data collected and processed by Openmesh is stamped with cryptographic hashes, including streams, queries, and indexing processes. All the hashes are published to a public ledger for transparency.

Product & Technologies 

Xnode Studio

Overview

Today’s developers need readily available machines and pre-built apps to facilitate rapid solution deployment. Xnode Studio is an open source Graphical User Interface (GUI) that deploys Xnodes quickly on bare-metal with no additional cost while providing convenient pre-prepared services and configuration templates. How do you use it?

STEP 1: Browse and select your services.

STEP 2: Scan for available infrastructure and deploy with ease

This user-friendly, web-based management console is central to Xnode's functionality. It allows users to intuitively design, deploy, and manage their infrastructure with pre-built templates and applications.

Why?

  • Maintain full ownership and control over your infrastructure by choosing single-tenanted bare metal machines and deploying Free and Open Source Software (FOSS).
  • The versatility of a nix-based operating system means you can change the software stack quickly and safely with reproducible builds and no reboot.
  • Scalability made easy through Xnode Clusters, just add more Xnodes to your cluster.
  • Web3 infrastructure for computer hardware resources: become a part of a growing network that will serve as the backbone data infrastructure and the ultimate DePIN for the world.


How it works

Simplified Diagram of the Xnode deployment system: June 2024

Users have deployments in the Xnode Console, which can be hosted either on a bare-metal provider or in an Xnode Unit (XU), a VPS linked to an NFT.
When a user picks an Xnode template, they push that template in JSON format to the backend, which makes it available through the Xnode Studio API.
When that user then deploys an Xnode running XnodeOS, an agent called the 'Xnode Admin Service' asks the Studio API for its configuration and then rebuilds the machine with the configured software stack.
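As a rough illustration of that pull step, the agent could fetch its JSON configuration as in the Go sketch below; the endpoint path and field names are assumptions, not the actual Studio API:

package main

import (
    "encoding/json"
    "fmt"
    "log"
    "net/http"
)

// XnodeConfig mirrors a hypothetical template payload.
type XnodeConfig struct {
    Services []string          `json:"services"` // e.g. ["postgres", "grafana"]
    Options  map[string]string `json:"options"`
}

// fetchConfig asks the Studio API for this Xnode's current configuration.
func fetchConfig(apiBase, xnodeID string) (*XnodeConfig, error) {
    resp, err := http.Get(apiBase + "/xnodes/" + xnodeID + "/config")
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()

    var cfg XnodeConfig
    if err := json.NewDecoder(resp.Body).Decode(&cfg); err != nil {
        return nil, err
    }
    return &cfg, nil
}

func main() {
    cfg, err := fetchConfig("https://studio.example/api", "xnode-123")
    if err != nil {
        log.Fatal(err)
    }
    // The agent would translate cfg into Nix expressions and rebuild itself.
    fmt.Printf("services to configure: %v\n", cfg.Services)
}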

Xnode One

We believe that better infrastructure is essential for the growth of any industry. As part of our Decentralized Cloud Initiative 2024, Openmesh is thrilled to announce a giveaway of $100 million of public cloud resource equivalence to accelerate the Web3 ecosystem.

Decentralized Virtual Machines:
Xnode One is a Decentralized Virtual Machine (DVM) that provides $3,500 USD worth of public cloud resource equivalence, offering 12 months of access to build and innovate in the Web3 ecosystem.

Xnode One is designed for developers, startups, enterprise users, and Decentralized Autonomous Organizations (DAOs) looking to build Web2 and Web3 infrastructure and applications quickly and efficiently.

With Xnode One, you can deploy validator blockchain nodes, dApps, data clouds, streaming service APIs, P2P compute and storage layers, analytic engines, decentralized integrated development environments, and hosting services. Build from pre-prepared templates found in Xnode Studio or start from scratch. The limit is your imagination.

To get started, obtain an Xnode One access card or code and navigate to openmesh.network/xnode. Follow the setup instructions provided, and you can begin building and deploying applications immediately. The 12 months of access begin once the entitlement NFT is sacrificed to receive its replacement, an active Xnode One NFT.

Build out your machine, and you’re able to transfer access via the NFT. Wipe the machine before you transfer it or keep it running.

Xnode One deploys VPS for free for 1 year.

In effect, we are extending a third-party VPS system, replacing KYC with NFTs. It is still centralized, but it does three things:


1. Gets people using our software


2. Gets people to try out "owning" their personal infrastructure with minimal time and financial commitment.


3. Shows the world what a decentralized cloud system could look like where KYC and vendor lock-in no longer apply.


Openmesh API

Overview

The Openmesh API is currently under development. It is designed to work alongside an instance of Openmesh Core, extracting data from the network's IPFS layer and serving it to a variety of API implementations; this software is called the Openmesh Gateway. The APIs scoped for implementation (subject to change based on user demand) are a RESTful API for historical data and a WebSocket API for real-time live data.

Legacy software: Unified APIs

Xnode v3 made use of various software and tools to consume the data post-normalization and make it ready for analytics; this pipeline may or may not be reimplemented with the new architecture. Currently, this pipeline is used only for the Pythia prototype.







Users can analyse Openmesh data via a variety of entry points that provide unified access to all collected data. These are:

  • Pythia, an open-source data analytics tool, forked off Apache Superset, which allows for visualizations to be created using data stored in Openmesh databases. Suited for targeted use cases that don’t require the transfer of large amounts of data
  • A historical data store contained within scalable object storage and served through a Cloudflare CDN. Suited for use cases that require large amounts of historical data
  • A WebSocket API that streams live market events at low latency. Suited for real-time use cases



Universal Data Collector (UDC)



The UDC is one of the core components of Openmesh. It facilitates the collection, enhancement, and standardization of a variety of data sources, extracting atomic events and packaging them into indexed data sets that can be used for live-streamed and historical data analytics.

The process begins with raw data collection by Openmesh Core nodes through the collector: each node computes the sources it is eligible to listen to for each block. When a block is finalized, the node connects to its next source's API via a WebSocket and pulls data into chunks, which it then identifies by an IPFS CID (Content Identifier). Using Boxo, this CID is then available to peers on the network, allowing data to spread in a torrent-like fashion.
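A minimal Go sketch of that identification step, using the go-cid and go-multihash libraries; the codec and hash parameters used by the real collector are assumptions here:

package main

import (
    "fmt"
    "log"

    "github.com/ipfs/go-cid"
    mh "github.com/multiformats/go-multihash"
)

// chunkCID hashes a data chunk and wraps it in a CIDv1 so peers can
// request it over the network by content address.
func chunkCID(chunk []byte) (cid.Cid, error) {
    sum, err := mh.Sum(chunk, mh.SHA2_256, -1) // SHA2-256, default length
    if err != nil {
        return cid.Undef, err
    }
    return cid.NewCidV1(cid.Raw, sum), nil
}

func main() {
    c, err := chunkCID([]byte(`{"event":"trade","price":"3500.10"}`))
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("chunk CID:", c.String()) // announced to peers via Boxo
}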

These processes are designed to be fault tolerant and to handle any unexpected event from the exchange or data source that could cause an outage or drop in data.

Blockchain data collectors connect directly to blockchain nodes, making JSON-RPC calls to pull full, granular data to be processed and transformed into human-readable on-chain events. Openmesh has been designed to work with any connection to these nodes – it doesn’t rely on existing APIs. If users wish to run a connector themselves, they can run their own nodes or use any provider without worrying about incompatibility.
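For example, a collector targeting an Ethereum-compatible node can pull the latest block with a standard JSON-RPC call. This sketch uses only the Go standard library; the local endpoint is a placeholder for whichever node or provider the operator chooses:

package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "log"
    "net/http"
)

func main() {
    // Any node endpoint works; there is no dependency on a specific provider.
    endpoint := "http://localhost:8545"

    req := map[string]any{
        "jsonrpc": "2.0",
        "id":      1,
        "method":  "eth_getBlockByNumber",
        "params":  []any{"latest", true}, // true = include full transactions
    }
    body, _ := json.Marshal(req)

    resp, err := http.Post(endpoint, "application/json", bytes.NewReader(body))
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()

    var out map[string]any
    if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
        log.Fatal(err)
    }
    fmt.Println("received block:", out["result"] != nil)
}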

After the raw data is collected, it is normalized using the standards developed by the CCXT project, and if consensus is reached on the validity (so-say-we-all) of that data, it is indexed on the blockchain in Openmesh Ledger, making it available as a resource to the network.

Blockchain data requires special treatment, as there are smart contract events that do not exist in the block header, such as those relating to DeFi protocols; these need to be processed and stored under an index relating to the selected data topic (e.g., Uniswap). Not all contract event data is necessarily collected, only that which the OpenmeshDAO decides is a valuable source.

Being a community-focused open-source project, Openmesh Core has been designed to make it as easy as possible for community members to expand Openmesh data coverage, with highly modularised and containerised code that can easily extend to any arbitrary data source. The high-level steps to add an additional data source are:

  • Define the endpoint, data topics and methods in sourcedefinitions.go
  • Define the normalization procedures if they do not fit into existing functions in sources.go
  • Submit a pull request and initiate the DAO governance process for data onboarding; the data source will then be introduced to Openmesh as another data source in the network (a hypothetical sketch of a source definition follows).
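As a purely hypothetical sketch, a new entry in sourcedefinitions.go might take a shape like the following; the actual types and field names in the Openmesh Core repository may differ:

package collector

// SourceDefinition describes where and how to collect from one data source.
type SourceDefinition struct {
    Name      string   // e.g. "examplex", a hypothetical exchange
    Endpoint  string   // WebSocket or REST endpoint to connect to
    Topics    []string // data topics, e.g. trades or order book updates
    Normalize func(raw []byte) (Event, error) // maps raw payloads to normalized events
}

// Event is a normalized, CCXT-style atomic event (illustrative only).
type Event struct {
    Source    string
    Topic     string
    Timestamp int64
    Payload   map[string]string
}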



Pythia

Pythia is under active development


Introduction

Except for limit order book updates, we store all of our data in a PostgreSQL database. Users can query this database directly via an interface. For this, we use Apache Superset, an open-source business intelligence tool. For detailed information on how Superset works and an exploration of the full feature set on offer, you can visit their documentation.


Access

The service is hosted at query.tech.openmesh.network.


Technical Architecture

Pythia connects to our main datastore, a Postgres database, and executes user-defined queries in an efficient manner. When a user enters the site (whether they are authenticated or not), they can use a SQL IDE to perform analysis on any of our public tables.

Queries are ordered in a queue and executed on the core database directly.

For authentication, Pythia relies purely on Web3. User credentials are stored in a Postgres backend, but no passwords are required.

The authentication flow is as follows:

1. The user requests a random nonce from the server.
2. The server generates a random nonce, assigns it to the user’s address in the database, and returns it to the user.
3. The user signs the nonce with their Ethereum wallet and sends the signature to the server.
4. The server verifies that the signature matches the nonce and the stored address, logging the user in if the verification passes.

Only the user’s address is stored, no other personal details are required. The user’s address is then used as the default username.
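Step 4 of this flow can be sketched in Go using the go-ethereum libraries. This is a minimal illustration of the standard personal_sign verification pattern, not necessarily Pythia's exact implementation:

package main

import (
    "fmt"
    "log"

    "github.com/ethereum/go-ethereum/accounts"
    "github.com/ethereum/go-ethereum/common"
    "github.com/ethereum/go-ethereum/crypto"
)

// verifyNonceSignature checks that sig is a signature over nonce made by
// the wallet at addr, using the standard Ethereum signed-message prefix.
func verifyNonceSignature(addr common.Address, nonce string, sig []byte) (bool, error) {
    if len(sig) != 65 {
        return false, fmt.Errorf("signature must be 65 bytes")
    }
    // Wallets usually encode the recovery id as 27/28; normalize it to 0/1.
    if sig[64] >= 27 {
        sig[64] -= 27
    }
    hash := accounts.TextHash([]byte(nonce))
    pub, err := crypto.SigToPub(hash, sig)
    if err != nil {
        return false, err
    }
    return crypto.PubkeyToAddress(*pub) == addr, nil
}

func main() {
    // In the real flow, addr, nonce, and sig arrive with the login request.
    log.Println("verifyNonceSignature implements the check in step 4")
}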

Pythia also has a caching layer that minimizes the resources required for creating data products. When a user executes a saved query, e.g. on a dashboard or for an individual data product, the query result is cached in a Redis instance; the next time somebody accesses that query, the result is returned instantly from the cache, saving on computation. Users can specify the timeout of the cache when they create a data product if they would like more up-to-date results.
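A minimal Go sketch of this cache-aside pattern using go-redis; the key naming and the fallback query function are illustrative assumptions:

package main

import (
    "context"
    "fmt"
    "time"

    "github.com/redis/go-redis/v9"
)

// cachedQuery returns a cached result when present; otherwise it runs the
// query and stores the result for ttl (the user-specified cache timeout).
func cachedQuery(ctx context.Context, rdb *redis.Client, key string,
    ttl time.Duration, run func() (string, error)) (string, error) {

    // Cache hit: return immediately, no database work needed.
    if val, err := rdb.Get(ctx, key).Result(); err == nil {
        return val, nil
    }
    // Cache miss (or Redis error): execute against the core database.
    val, err := run()
    if err != nil {
        return "", err
    }
    return val, rdb.Set(ctx, key, val, ttl).Err()
}

func main() {
    ctx := context.Background()
    rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})
    res, _ := cachedQuery(ctx, rdb, "dashboard:query:42", 5*time.Minute,
        func() (string, error) { return `{"rows": 100}`, nil })
    fmt.Println(res)
}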

Results can be displayed as raw tables and visualizations, but the full datasets can also be downloaded and exported in a variety of formats, including CSV and JSON.
Charts can be shared and embedded, and they will be tied to your wallet. We limit the number of rows that can be exported, so users are encouraged to write aggregate queries, which can still perform complex analysis over hundreds of millions of rows.

Additionally, Pythia enhances the user experience by providing an intuitive interface that translates natural language queries into sophisticated SQL commands, enabling seamless interaction with Openmesh’s data.

In the future we will boast even more expansive data coverage, including multiple blockchains, and full trade histories for symbol pairs. User queries and query results will also be encrypted by the user’s public keys, meaning that the data you collect and analytics you perform will be end-to-end encrypted.

The Openmesh community will also be involved in creating new visualization types and further customization options.


Innovations


Core Documentation

  • API Documentation

Detailed guides and references for all Openmesh APIs.

https://docs.openmesh.network/products/openmesh-api

  • User Guides

Documentation created by the community for the community.

https://github.com/Openmesh-Network

  • Integration Guides

Instructions for integrating Openmesh with other tools and platforms.

https://www.openmesh.network/xnode/docs

  • Our Core Principles

Best practices and protocols to ensure the security of your data and operations.

https://docs.openmesh.network/openmesh/our-core-principles

  • Developer Tools

Resources and tools specifically for developers building on Openmesh.

https://github.com/Openmesh-Network

  • Troubleshooting

Comprehensive guides to help you resolve issues and optimize performance.

https://www.openmesh.network/xnode/docs

Roadmap

 

Governance & Transparency

Our Commitment to Governance and Transparency

At Openmesh, we believe that strong governance and transparency are essential to building trust and fostering innovation.
Our governance framework ensures accountability, strategic oversight, and ethical conduct across all our operations. We are committed to maintaining openness in our processes and decisions, empowering our community through transparent practices.

Robust Governance Framework
Openmesh's governance framework is designed to ensure accountability, strategic oversight, and ethical conduct across all operations. Our Board of Directors and specialized advisory committees offer expert guidance on technical, ethical, and strategic matters.

Leading with Transparency
Transparency is central to Openmesh. We share decision-making processes and financial reports with our community to build trust. Regular updates on our progress, challenges, and milestones are provided through blogs, reports, and community calls to keep stakeholders informed and engaged.
  • Open Decision-Making: We share our decision-making processes and rationale with our community to promote trust.

  • Financial Disclosure: Our reports are publicly available, providing insight into our funding, expenditures, and financial health.

  • Regular Updates: We provide regular updates on our progress and milestones through blogs, reports, and community calls.


Upholding Ethical Standards

Openmesh is committed to the highest ethical standards, guided by our comprehensive Code of Conduct. We ensure compliance with all relevant laws and regulations, and our robust whistleblower policy protects those who report unethical behavior.

Engaging with Our Community
We actively involve our community through open forums, surveys, and polls, gathering valuable feedback and insights. Regular events and webinars keep our community informed and engaged with our governance practices.

Infrastructure

  • XnodeOS

XnodeOS is an operating system derived from work done by the NixOS project but specialized for certain use cases such as Web3 nodes and data infrastructure. The operating system is built for several deployment methods, including iPXE netboot, ISO, and kexec, to make it compatible with as many cloud providers as possible, since each can differ in their API capabilities.
While currently in very early stages, this operating system gives Openmesh a platform to build its infrastructure technology on very few dependencies, which minimises overhead and upstream risk. Being Nix-based, deployments are also highly stable and reproducible.

Next features to be added

Currently, the Studio API system enables direct JSON responses to be received by Xnodes, which they convert into Nix code before reconfiguring themselves; however, this system lacks several important features such as version control, rollbacks, and CI/CD tests.
Reconfiguration on Xnode will be expanded to support a git-based configuration in which Xnode Studio applies git commits to a repository directly from the browser, using isomorphic-git.
This system would enable the previously mentioned features to be built and would make it easier to fully self-host the infrastructure that manages Xnode.

Scalability: If a huge number of Xnodes are being reconfigured through this infrastructure, PowerDNS can be employed to notify Xnodes through a TXT record when they have a new configuration to pull. This event-based solution would reduce the strain of constant querying (at a fixed search interval) from many Xnodes on the configuration infrastructure.
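A minimal Go sketch of that event-based check, assuming a hypothetical TXT record name and version value:

package main

import (
    "fmt"
    "log"
    "net"
    "time"
)

func main() {
    lastSeen := ""
    for {
        // Hypothetical record published by the configuration infrastructure.
        txts, err := net.LookupTXT("config-version.xnode.example")
        if err != nil {
            log.Printf("lookup failed: %v", err)
        } else if len(txts) > 0 && txts[0] != lastSeen {
            lastSeen = txts[0]
            fmt.Println("new configuration available, pulling:", lastSeen)
            // ...fetch the new configuration from the git host and rebuild...
        }
        time.Sleep(time.Minute) // far cheaper than polling the full config API
    }
}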

Reliability / Stability: By enabling reconfiguration infrastructure to be self-hosted by anybody, there can be more than one provider for git hosting, enabling users to avoid relying on a centralized entity.

  • Openmesh Core

Openmesh Core is a crucial piece of software responsible for maintaining consensus among the different validator nodes in the network. Openmesh Core utilizes the Tendermint consensus protocol, a Byzantine Fault Tolerant (BFT) consensus algorithm, to ensure secure and reliable blockchain operations. The specific implementation of Tendermint used by Openmesh Core is CometBFT, written in Go.


Responsibilities of Core

  • Block Proposal:

    • Selection of Proposer: The core is responsible for selecting the proposer (validator) for each round in a fair manner, typically using a round-robin or weighted round-robin approach based on stake.

    • Block Creation: The selected proposer creates a new block with valid transactions and broadcasts it to all validators.


  • Voting Process:

    • Pre-vote Collection: The core collects pre-votes from all validators for the proposed block. Each validator votes based on the validity of the block.

    • Pre-commit Collection: After pre-votes are collected, the core collects pre-commits from validators who have received sufficient pre-votes.

    • Final Commit: The core ensures that if a block receives pre-commits representing more than two-thirds of the total voting power, it is added to the blockchain (a minimal arithmetic sketch of this threshold, together with proposer selection, follows this list).


  • Transaction Validation:

    • Transaction Verification: The core verifies the validity of transactions included in the proposed block. This includes checking digital signatures, ensuring no double-spending, and validating any other protocol-specific rules.

    • State Transition: The core applies valid transactions to the current state, producing the new state that will be agreed upon.


  • Network Communication:

    • Message Broadcasting: The core handles the broadcasting of proposal, pre-vote, pre-commit, and commit messages to all validators in the network.

    • Synchronizing Nodes: The core ensures that all nodes stay synchronized, resolving any discrepancies in state or blockchain history.


  • Fault Tolerance and Security:

    • Byzantine Fault Tolerance: The core implements mechanisms to handle up to one-third of nodes being malicious or faulty, maintaining consensus despite these issues.

    • Punishment and Slashing: The core is responsible for detecting and punishing misbehavior, such as double-signing or other forms of protocol violations, by slashing the stake of malicious validators.


  • Consensus Finality:

    • Immediate Finality: The core ensures that once a block is committed, it is final and cannot be reverted, providing security and confidence to users and applications relying on the blockchain.


  • State Management:

    • State Updates: The core manages the application of committed transactions to the state, ensuring that all nodes have a consistent view of the blockchain state.

    • Data Integrity: The core ensures the integrity and correctness of the state data, preventing corruption and inconsistencies.


  • Protocol Upgrades:

    • Consensus Upgrades: The core may facilitate upgrades to the consensus protocol, ensuring smooth transitions without disrupting the network.

    • Feature Implementation: The core implements new features and improvements in the consensus mechanism, enhancing performance, security, and functionality.


  • Data Validation and Seeding:

    • Data Submission: At each block, each node is assigned a different data source to fetch data from until the next block. The data is then seeded by the core via IPFS, and a CID of the data is submitted as a transaction.

    • Data Seeding: The core is responsible for seeding the data in the network, which it does via IPFS.
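To make the proposer-selection and voting-threshold responsibilities above concrete, here is a small Go sketch reduced to the underlying arithmetic. It is illustrative only: CometBFT's real proposer-priority algorithm and vote accounting are considerably more involved.

```go
// Illustrative sketch of stake-weighted proposer selection and the
// two-thirds pre-commit threshold. Not CometBFT's actual algorithm.
package main

import "fmt"

type Validator struct {
	Addr  string
	Power int64 // voting power, typically proportional to stake
}

// proposerFor picks the proposer for a round by walking the validator set
// weighted by power: validators with more stake are selected more often.
// The multiplier is a crude deterministic spread, for illustration only.
func proposerFor(vals []Validator, round int64) Validator {
	var total int64
	for _, v := range vals {
		total += v.Power
	}
	slot := (round * 7919) % total
	for _, v := range vals {
		if slot < v.Power {
			return v
		}
		slot -= v.Power
	}
	return vals[0] // unreachable when total > 0
}

// hasQuorum reports whether pre-commits carry strictly more than 2/3 of
// the total voting power — the BFT condition for committing a block.
func hasQuorum(vals []Validator, precommits map[string]bool) bool {
	var total, committed int64
	for _, v := range vals {
		total += v.Power
		if precommits[v.Addr] {
			committed += v.Power
		}
	}
	return committed*3 > total*2
}

func main() {
	vals := []Validator{{"a", 10}, {"b", 20}, {"c", 30}}
	fmt.Println("round 5 proposer:", proposerFor(vals, 5).Addr)
	// b and c together hold 50 of 60 power: 50*3 > 60*2, so quorum is met.
	fmt.Println("quorum:", hasQuorum(vals, map[string]bool{"b": true, "c": true}))
}
```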



  • Data Flow

Note: Kafka is no longer used in Openmesh's data collection pipeline due to its centralized architecture (see: Openmesh Core). The description below is retained for reference.
We use Apache Kafka for transporting our messages through our pipeline. Kafka is inherently distributed, scalable, and fault tolerant – essential for our high-throughput, low-latency, critical data. The Openmesh infrastructure has been built from the ground up for horizontal scalability, leveraging Kafka's parallelism and distributed architecture to ensure we can always keep up with intense throughputs.

CEX data, once collected, is produced into a raw topic of semi-structured JSON, partitioned by exchange-symbol tuples. Blockchain data is already structured, so it is produced to different topics in the Apache Avro format. Apache Avro is a JSON-like data format that preserves schema information and minimises message sizes.

At this point, a network of stream processors consumes the data. This is a massive consumer group that ingests all raw market data for processing. Thanks to partitioning, all of these processes run in parallel, yet events for a given exchange and symbol are guaranteed to be processed in order. All raw data is also archived in object storage as compressed JSON. Blockchain data is processed as well – in particular smart contract events, which can contain important data points that must be decoded before they can be used.
Once processed, the data is sorted and produced to a collection of standardized topics, which have consistent schemas for each message. At this point, all of the data is structured and so is encoded as Avro.

Processed data is archived into object storage by a process that converts the Avro data into Apache Parquet, a column-based format with tiny file sizes, well suited to structured data destined for object storage. This process periodically collects chunks of data, combines them into a single file, and stores it under a datetime partition. The file structure organises events by year, month, day, and hour for efficient indexing and retrieval.
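For instance, the partition path for an archived file can be derived from the event timestamp alone. The sketch below assumes a hypothetical bucket layout (topic prefix and Hive-style key=value directories); only the year/month/day/hour structure comes from the description above.

```go
// Sketch of datetime partitioning: building an object-storage key from
// an event timestamp. The prefix and naming scheme are hypothetical.
package main

import (
	"fmt"
	"time"
)

// partitionKey returns a key like
// "trades/year=2024/month=09/day=26/hour=14/part-0001.parquet".
func partitionKey(topic string, t time.Time, part int) string {
	t = t.UTC() // partition on UTC so keys are unambiguous across regions
	return fmt.Sprintf("%s/year=%d/month=%02d/day=%02d/hour=%02d/part-%04d.parquet",
		topic, t.Year(), int(t.Month()), t.Day(), t.Hour(), part)
}

func main() {
	fmt.Println(partitionKey("trades", time.Now(), 1))
}
```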

Another process places the data into a Postgres database, where it is indexed and queryable. PowerQuery connects to this database directly, letting users write arbitrary SQL and perform advanced analytics. This process handles up to 1,000 rows at a time, placing each event topic into its own table.

Finally, the data is consumed by our websocket broadcaster consumer group, which streams market events out to our users. These are typical Kafka consumers that micro-batch messages to increase throughput. The consumer group is immensely scalable, and rebalances occur as the group scales up and down to meet throughput demands and handle any dropped members.
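As a sketch of this micro-batching pattern in the (legacy) Kafka pipeline, the following Go program fetches messages, flushes them to subscribers in small batches, and only then commits offsets. The broker address, topic name, and the choice of the github.com/segmentio/kafka-go client are assumptions for illustration, not Openmesh's actual code.

```go
// Hedged sketch of a micro-batching websocket broadcaster consumer.
package main

import (
	"context"
	"log"

	kafka "github.com/segmentio/kafka-go"
)

// broadcast stands in for pushing events to connected websocket clients.
func broadcast(msgs []kafka.Message) {
	log.Printf("broadcasting %d events", len(msgs))
}

func main() {
	r := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"}, // placeholder broker
		GroupID: "ws-broadcasters",          // group rebalances as members join/leave
		Topic:   "trades.standardized",      // hypothetical standardized topic
	})
	defer r.Close()

	ctx := context.Background()
	batch := make([]kafka.Message, 0, 100)

	for {
		// A production version would also flush partial batches on a
		// timer so quiet periods don't delay delivery indefinitely.
		m, err := r.FetchMessage(ctx)
		if err != nil {
			log.Fatal(err)
		}
		batch = append(batch, m)
		// Micro-batch: flush once the batch is full, trading a little
		// latency for much higher throughput; commit offsets only after
		// broadcasting, for at-least-once delivery.
		if len(batch) == cap(batch) {
			broadcast(batch)
			if err := r.CommitMessages(ctx, batch...); err != nil {
				log.Fatal(err)
			}
			batch = batch[:0]
		}
	}
}
```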


  • Connectors

Note: Kafka is no longer used in Openmesh's data collection pipeline due to its centralized architecture (see: Openmesh Core).


Data connectors are the first step in the Openmesh data pipeline. These connect to data sources, both on and off chain, to ingest granular data in real time to be processed by the pipeline.

Each connector is containerised and orchestrated by a Kubernetes cluster. They are each stateless, able to be turned on and off quickly and without any issues. They are also isolated from each other, so in the event of a catastrophic failure of a single exchange, every other exchange will be completely unaffected. This also means we can add more sources and data without fear that the system won’t be able to scale.

Precautions have been taken to ensure that data isn't missed when an unexpected event happens on the exchange. Many possible scenarios and status codes are accounted for; for example, in the event of a 429 (rate limit), the connector will wait a certain amount of time before connecting again. In the event of a stale connection, a background process will notice that no data has been sent over the connection and immediately raise an alarm that the connection ought to be restarted.
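The two safeguards just described can be sketched in a few lines of Go. The durations, status handling, and channel-based signalling below are illustrative, not Openmesh's actual connector code.

```go
// Sketch of rate-limit backoff and a stale-connection watchdog.
package main

import (
	"log"
	"net/http"
	"time"
)

// backoffFor maps an HTTP status to a wait before reconnecting.
func backoffFor(status int) time.Duration {
	if status == http.StatusTooManyRequests { // 429: rate limited
		return 30 * time.Second
	}
	return 5 * time.Second
}

// watchdog raises an alarm if no data arrives within maxIdle. Every
// received message should be signalled on the activity channel.
func watchdog(activity <-chan struct{}, maxIdle time.Duration, alarm func()) {
	timer := time.NewTimer(maxIdle)
	for {
		select {
		case <-activity:
			if !timer.Stop() {
				<-timer.C
			}
			timer.Reset(maxIdle)
		case <-timer.C:
			alarm() // stale connection: ask the supervisor to restart it
			timer.Reset(maxIdle)
		}
	}
}

func main() {
	activity := make(chan struct{})
	go watchdog(activity, 10*time.Second, func() {
		log.Println("no data received; restarting connection")
	})
	// A real connector would send on `activity` for every message received
	// and sleep for backoffFor(status) before reconnecting after an error.
	select {}
}
```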

For blockchain connectors, we connect directly to nodes via JSON-RPC. Over websockets, the connectors call subscription methods to receive updates whenever a new block is produced on the chain. Afterwards, over HTTP, methods are called to retrieve information about the block, including all transactions that have occurred and all smart contract events. As an open-source project, Openmesh has been designed to work with any connection to these nodes – it doesn't rely on existing APIs. If users wish to run a connector themselves, they can run their own nodes or use any provider without compatibility concerns.
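For illustration, here is a minimal version of that subscription step against an Ethereum-style node. The node URL is a placeholder and gorilla/websocket is an assumed client choice; eth_subscribe with the "newHeads" parameter is the standard JSON-RPC method for new-block notifications.

```go
// Minimal sketch of a blockchain connector subscribing to new blocks.
package main

import (
	"encoding/json"
	"log"

	"github.com/gorilla/websocket"
)

func main() {
	// Placeholder endpoint: any node exposing the websocket JSON-RPC API works.
	conn, _, err := websocket.DefaultDialer.Dial("ws://localhost:8546", nil)
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// Subscribe to new block headers.
	sub := map[string]any{
		"jsonrpc": "2.0", "id": 1,
		"method": "eth_subscribe", "params": []string{"newHeads"},
	}
	if err := conn.WriteJSON(sub); err != nil {
		log.Fatal(err)
	}

	// Each notification carries the new block header; a real connector
	// would follow up over HTTP to fetch full transactions and events.
	for {
		var msg json.RawMessage
		if err := conn.ReadJSON(&msg); err != nil {
			log.Fatal(err)
		}
		log.Printf("event: %s", msg)
	}
}
```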

The codebase has been developed to support any arbitrary data source in the future – a modular and flexible structure means that the community can extend the Openmesh source to connect to any source of crypto data. As an open source project, the community is encouraged to add any data source that could augment our coverage, contributing to the network and making even more data available. In the future, the process for adding additional data sources will be streamlined, and users will be able to monitor the status of all connectors in a dashboard that showcases the full lifecycle of development. A public ledger will also be published which will enforce immutability in the collected data, providing a layer of trustless security that will give the cryptocurrency community a guarantee that our data is valid and accurate.


Tokenomics

Token distribution


Openmesh DAO 

Openmesh's DAO governance model is designed to make key decisions such as resource allocation more democratic. This model empowers our core team to vote on critical issues, ensuring that decisions are made transparently and in the best interest of the community.


Because this DAO decides who holds Verified Contributor status, which is required to join a department, it can be seen as the "Department Owner".


Managing the group of Verified Contributors is this DAO's only responsibility: adding promising new candidates and removing those who are not involved enough, who abuse the system, or whose continued membership would otherwise harm the Verified Contributors.


Optimistic actions

As adding Verified Contributors is not a very harmful action and is expected to happen regularly, it can be done optimistically. This means that an existing Verified Contributor creates the action to mint a new NFT to a promising candidate, explaining why they made this choice. All other Verified Contributors then have 7 days to review this action and, should they disagree, the opportunity to reject it. Only a single rejection is required to prevent the action from happening; a minimal model of this rule is sketched below.
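This plain-Go model captures only the rule's logic; the actual implementation lives in the DAO's smart contracts.

```go
// Model of an optimistic action: executable after a 7-day review window
// unless any single Verified Contributor rejects it.
package main

import (
	"fmt"
	"time"
)

const reviewWindow = 7 * 24 * time.Hour

type OptimisticAction struct {
	Created  time.Time
	Rejected bool // one rejection is enough to block the action
}

// Executable reports whether the action may be carried out at time now.
func (a OptimisticAction) Executable(now time.Time) bool {
	return !a.Rejected && now.Sub(a.Created) >= reviewWindow
}

func main() {
	a := OptimisticAction{Created: time.Now().Add(-8 * 24 * time.Hour)}
	fmt.Println(a.Executable(time.Now())) // true: window passed, no rejection
}
```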


If someone creates suspicious actions or rejects actions without good reason, that could be grounds for removing their Verified Contributor status. It is recommended to raise the issue with them before taking this last resort.


Voting actions

Any other action needs at least 20% of all Verified Contributors to vote on it and a majority (>50%) of those votes to approve it, as sketched below.
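A minimal check of this rule (pure arithmetic; treating abstentions as not voting is an assumption made here for illustration):

```go
// Voting rule: at least 20% of all Verified Contributors must vote, and
// strictly more than half of the votes cast must be in favour.
package main

import "fmt"

func passes(total, votesFor, votesAgainst int) bool {
	votes := votesFor + votesAgainst
	quorum := votes*5 >= total     // >= 20% participation
	majority := votesFor*2 > votes // strictly more than 50% approval
	return quorum && majority
}

func main() {
	fmt.Println(passes(100, 15, 10)) // 25 votes (25%), 60% in favour: true
}
```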

The only such action that is expected is the burning of NFTs. Any Verified Contributor who knows they will stop interacting with the platform can also burn their NFT themselves, saving the DAO the hassle of voting.


The Verified Contributors are in complete control of this DAO, so they can decide on any other actions they want to perform, such as building up a treasury.


https://ovc.openmesh.network/

Opencircle 

Learning, Networking, Earning, Growing & Influencing Web3

A go-to place to learn, meet people, connect, and apply what you've learned. Find projects to work on and earn for your contributions, regardless of where you live and work, through OpenR&D.


  • Academy:

    OpenCircle Academy is not just about learning; it’s about mastering the intricacies of Web3 technology. From blockchain basics to advanced decentralized finance (DeFi) systems, our curriculum is curated by industry pioneers and updated regularly to reflect the latest trends and technologies. Interactive courses, hands-on projects, and real-time simulations offer you a practical learning experience that goes beyond just theory.


  • Open R&D:

    OpenR&D is proud to be one of the pioneers of R&D platforms in Web3. We are empowering developers, engineers, and DAOs to improve the developer experience and project management through a truly decentralized R&D platform built for scalable innovation in distributed engineering.

Community

We are committed to quality technical and academic research on modern data architecture, data mesh, scalable consensus algorithms, and everything open-source!


Join our journey!

Follow us every step of the way and get the latest updates about our tech and our events.

  • Network with Web3 developers and enthusiasts.

  • Interact with our developers and our whole community.

  • Check out our repositories and contribute.

  • Check out our team and learn about the latest events.

  • Get the latest updates and announcements.

  • Get the latest publications and announcements from our team.

  • Get the latest documentation and resources.

  • Get the latest walkthroughs about our tech.

  • Join our forum and contribute ideas.

OpenR&D

Our vision is to empower Web3 projects and teams to collaborate seamlessly


OpenR&D lets information flow freely, enabling efficient project management and distribution of work. We are committed to making development and payment more open, transparent, and accessible for everybody.


Addressing pain points in Open Source and DAOs

Open-source communities face a variety of problems when it comes to development and coordination. Challenges arise in holding contributors accountable and in adhering to any sort of law. By implementing a DAO structure, core teams can assemble and create a structure that makes them the final decision-makers for a variety of responsibilities.


However, we need an environment that leverages developer experience, where developers can perform better while maintaining truly decentralized governance. These are the issues we specifically addressed with our R&D portal. We aim to improve the developer experience while maintaining some control over development, and to introduce mechanisms that incentivize proper behavior and empower developers. Importantly, we wanted this portal to scale as we do, and not limit our ability to grow or be agile.


Empowering developers and engineers

Empowering developers is crucial to scaling engineering. A variety of obstacles within centralized businesses prevent developers from owning and sharing their work. Using a platform like ours, developers have a more liberating experience, with access to task lists in complete transparency.


DAO friendly

OpenR&D supports the use of Decentralized Autonomous Organizations (DAOs): entities with no central authority, governed by their community members. Members of a DAO who own community tokens can vote on initiatives for the entity. The smart contracts and code governing a DAO's operations are publicly disclosed and available for everyone to see.

Openmesh Network Summary


Openmesh partners

Primary resource allocation


Openmesh Expansion Program (OEP) 2024 Participation


Participation Logistics

1. Sponsors Meetup

Selected sponsors will meet the Openmesh team to discuss the roadmap and understand how their contributions will accelerate Openmesh's expansion. This event will also serve as an opportunity for in-depth discussions and clarifications regarding the sponsorship process.

2. Payment and Confirmation Phase (Sep 01 - Sep 26, 2024)

Whitelist and payment confirmation.

3. Post-Participation and Minting Phase (Sep 26th, 2024 - Sep 26th, 2025)

Sponsors will receive Cloud Credits, redeemable as Xnodes, which can be utilized to support the Decentralized Cloud Initiative (DCI) and other Openmesh services, or converted into sOPEN or OPEN tokens. As early backers, sponsors may also receive governance tokens as a gesture of appreciation for their support. These tokens will enable sponsors to contribute to the network's governance, particularly in roles such as network operators, Verified Contributors, and Openmesh Verified Resource Providers.

© Openmesh. 2024. All rights reserved