NGI Layers
Within the Next Generation Internet initiative, the following ten "layers" are used to categorise projects.
Layer 1: Trustworthy hardware and manufacturing
Technology always manifests itself in a concrete, physical form. This may be through devices we carry with us, like mobile phones, tablets or laptops. Our schools, homes, factories and offices are full of equipment like televisions, desktop computers, speakers and cameras. Increasingly, technology finds its way even into our bodies, for instance as medical implants. Technology is being embedded everywhere within our buildings and infrastructure. This equipment needs to be engineered and produced in a reliable and transparent way, but it also needs to be maintained and capable of being protected. How can we make sure that our devices not only withstand natural disasters, but also cannot be compromised and used to attack us?
Layer 2: Network infrastructure, including routing, P2P and VPN
Network technology has for many people become all but invisible, which is one of the great engineering triumphs of mankind. Yet underneath every message delivered to our phone, and every page we visit on the web, there is a complex fabric of radio, optical and electrical network technologies making it possible, requiring a huge amount of handshakes, key exchanges and other invisible interactions along the way. From the physical and link layers up, network protocols arrange for everything in between sender and recipient to just work.
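To make those invisible interactions a little more tangible, here is a minimal sketch (in Python, standard library only) of what happens before a single byte of a web page arrives: a name lookup, a TCP handshake and a TLS key exchange. The host name is just an example; any HTTPS site would do.

```python
import socket
import ssl

host = "nlnet.nl"  # example host

# 1. Resolve the name and open a TCP connection (create_connection does
#    the DNS lookup and the SYN / SYN-ACK / ACK handshake for us).
with socket.create_connection((host, 443), timeout=5) as tcp:
    # 2. Perform the TLS handshake: key exchange, certificate validation.
    context = ssl.create_default_context()
    with context.wrap_socket(tcp, server_hostname=host) as tls:
        # Surface the details that normally stay hidden from the user.
        print("negotiated:", tls.version(), tls.cipher()[0])
```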
Many network protocols were not designed with today's usage in mind; by evolving network technologies we can achieve less congestion, higher throughput, lower energy usage, more robustness and better security. By using VPN technologies, we can tunnel safely through untrusted environments.
The internet is obviously the largest of these networks, but it is far from the only one. Not every network needs to connect across the planet; in fact, such a dependency would be a liability: network issues in one part of the planet could affect many others. By developing new technologies that use local networks via P2P or mesh technology, we can reduce global cascading effects.
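One basic building block of such local networks is discovering peers without any central server. Below is a minimal sketch using UDP broadcast on a local network; the port number and message format are hypothetical, and real mesh protocols add authentication and topology maintenance on top.

```python
import socket

PORT = 50000  # hypothetical, unregistered port


def announce() -> None:
    """Broadcast our presence to every host on the local network."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(b"HELLO peer-1", ("255.255.255.255", PORT))


def listen() -> None:
    """Collect announcements from peers until interrupted."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind(("", PORT))
        while True:
            data, (addr, _port) = s.recvfrom(1024)
            print(f"discovered peer at {addr}: {data.decode()}")
```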
Layer 3: Software engineering, protocols, interoperability, cryptography algorithms, proofs
Developing reliable and user-friendly software is an art in itself. Good software engineers make sure they create robust software that is known to behave predictably under all circumstances, is easy to maintain, and is capable of evolving over time while retaining those properties. The truth is that much software isn't there just yet, not because the people who developed it were not capable, but because our understanding of what good software engineering means has also evolved as technology has increased its role in society. These days it includes, for instance, reproducibility, memory safety, type safety, and formal and symbolic proofs. And of course, as new technologies with powerful new capabilities such as quantum computers appear on the stage, existing technologies need to be adapted to cater for this.
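One lightweight relative of the proof techniques mentioned above is property-based testing: instead of checking a handful of hand-picked cases, an invariant is checked against thousands of generated inputs. A small sketch using the Hypothesis library, with a toy encoder invented purely to have a property to verify:

```python
# Requires the Hypothesis library (pip install hypothesis).
from hypothesis import given, strategies as st


def run_length_encode(data: bytes) -> list[tuple[int, int]]:
    """Toy run-length encoder, used here only as a test subject."""
    out: list[tuple[int, int]] = []
    for byte in data:
        if out and out[-1][1] == byte:
            out[-1] = (out[-1][0] + 1, byte)
        else:
            out.append((1, byte))
    return out


def run_length_decode(pairs: list[tuple[int, int]]) -> bytes:
    return bytes(byte for count, byte in pairs for _ in range(count))


@given(st.binary())
def test_roundtrip(data: bytes) -> None:
    # The invariant: decoding any encoding must reproduce the input.
    assert run_length_decode(run_length_encode(data)) == data
```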
Layer 4: Operating Systems, firmware and virtualisation
Between applications and hardware lives the realm of operating systems, firmware and virtualisation. These should provide a solid and efficient basis for applications to run on, and protect the user from badly written software and malware.
Layer 5: Measurement, monitoring, analysis and abuse handling
Without measuring and monitoring, we cannot properly understand flaws and shortcomings of technologies or analyse their performance and quality. We also need instrumentation to understand how future technologies will interact with existing technologies.
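As a minimal illustration of active measurement, the sketch below times TCP connection setup to a host as a crude round-trip estimate. The host is just an example, and real measurement platforms take far more samples from many vantage points.

```python
import socket
import statistics
import time


def connect_time_ms(host: str, port: int = 443) -> float:
    """Time how long it takes to establish a TCP connection."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # measure only the connection setup, then close
    return (time.perf_counter() - start) * 1000


samples = [connect_time_ms("nlnet.nl") for _ in range(5)]
print(f"median connect time: {statistics.median(samples):.1f} ms")
```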
Layer 6: Middleware and identity, including DNS, authorisation, authentication, distribution/deployment, operations, reputation systems
A lot of our interactions on the internet and in other network environments are really quite complex, and we rely on many different 'sources of truth' to get us what we need. Your phone doesn't have a clue what the internet looks like, or where to find a website like https://nlnet.nl. In such situations we need the assistance of external services and systems. Your computer doesn't have much knowledge of its own when you turn it on (it is, after all, just running an operating system; how could it understand a dynamic system that changes configuration every second?), but it knows how to ask other computers. For a domain name, this means using the Domain Name System. Obviously, when something performs such a critical role you can expect lots of people to abuse it, and so research is needed to keep evolving that system.
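The question your computer effectively asks is: "which addresses belong to this name?". A minimal sketch using the standard library, which in turn delegates to the DNS resolvers your operating system is configured with:

```python
import socket

# Collect every (family, address) pair the system resolver returns.
addresses = {
    (family, sockaddr[0])
    for family, _, _, _, sockaddr in socket.getaddrinfo("nlnet.nl", None)
}
for family, address in sorted(addresses):
    print("IPv6" if family == socket.AF_INET6 else "IPv4", address)
```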
There are many other such interactions for which your computer, and the computers you want to interact with, need help from other computers. Think of authentication and authorisation, but also of something as simple as updating software components or querying reputation systems.
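Even "simply" updating software hides one such interaction: checking that what was downloaded matches what the publisher actually released. A minimal sketch with a hypothetical file name and a placeholder hash; real update systems use cryptographic signatures and transparency logs rather than a single pinned hash.

```python
import hashlib

# Placeholder: the publisher's announced SHA-256 hash would go here.
EXPECTED_SHA256 = "0" * 64


def verify_update(path: str) -> bool:
    """Check a downloaded file against the published hash."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest() == EXPECTED_SHA256


if verify_update("update.bin"):  # hypothetical file name
    print("update matches the published hash, safe to install")
else:
    print("hash mismatch: refuse to install")
```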
Layer 7: Decentralised solutions, including blockchain/distributed ledger
Through so-called overlay networks, people can create trusted networks within larger untrusted networks such as the internet. Distributed technologies cater for many different use cases: distributed hash tables (DHTs) are widely used for efficient sharing of content, conflict-free replicated data types (CRDTs) are often used in local-first applications, and append-only ledger technologies ("blockchains") use aggressive replication to provide a shared source of truth without a single point of failure or corruption.
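As a concrete taste, below is a sketch of one of the simplest CRDTs, a grow-only counter: each replica increments only its own slot, and merging takes the per-replica maximum, so replicas can synchronise in any order and still converge on the same value.

```python
class GCounter:
    """Grow-only counter CRDT: one slot per replica, merge by maximum."""

    def __init__(self, replica_id: str):
        self.replica_id = replica_id
        self.counts: dict[str, int] = {}

    def increment(self) -> None:
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + 1

    def merge(self, other: "GCounter") -> None:
        # Taking the per-replica maximum makes merging commutative,
        # associative and idempotent, so sync order doesn't matter.
        for rid, count in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), count)

    def value(self) -> int:
        return sum(self.counts.values())


# Two replicas diverge, then sync in either order and agree on the total.
a, b = GCounter("a"), GCounter("b")
a.increment(); a.increment()
b.increment()
a.merge(b); b.merge(a)
assert a.value() == b.value() == 3
```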
Layer 8: Data and Artificial Intelligence
While technology is just that, information and knowledge live on their own, separate layer: the technology behind OpenStreetMap, Wikipedia or OpenFoodFacts is a great accomplishment, but without content they are empty. The information and knowledge brought together inside them is essentially independent of the technological building blocks. Instead it involves editorial responsibility, curation, and inferring, deriving and harvesting from other reliable sources. The same applies to knowledge graphs: their value lies in the curated facts and the relations between them, not only in the software that stores them.
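To illustrate the inferring and deriving part: knowledge graphs typically store facts as subject-predicate-object triples and derive new facts from rules. A toy sketch with illustrative facts and one hypothetical rule:

```python
# Facts as (subject, predicate, object) triples; contents illustrative.
facts = {
    ("Amsterdam", "capital_of", "Netherlands"),
    ("Netherlands", "member_of", "European Union"),
}


def infer(triples: set) -> set:
    """Toy rule: the capital of a member state lies in that union."""
    derived = set(triples)
    for city, p1, country in triples:
        for country2, p2, union in triples:
            if p1 == "capital_of" and p2 == "member_of" and country == country2:
                derived.add((city, "located_in", union))
    return derived


# Derives ("Amsterdam", "located_in", "European Union").
print(infer(facts) - facts)
```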
Of course, the rise of artificial intelligence is to a great extent linked to the availability of trustworthy data and its responsible use. Making sure that untrustworthy, 'hallucinated' data is not fed back into the learning process of AI is a significant challenge.
Layer 9: Services + Applications
At the apex live the services and applications that can be built on top of all these other layers. These range from videoconferencing tools, office applications, email clients and social media to browsers, instant messengers, backup tools, games and creative tools.
Layer 10: Vertical use cases, search and community
This layer caters for discoverability and discovery, through search engines as well as through social tools. It also covers technologies tied to specific vertical industries or use cases, such as e-health.