Life2Ledger data authenticity and provenance

Distributed Ledger Technology Expertise for Datasets from IoT and IoMT devices

The PGO Netwerk Noord Use Case Project is co-funded by the European Regional Development Fund of the European Union.


THE OBJECTIVE:

Moving healthcare to Web 3.0

Verifying authenticity of data in a connected world of Io(M)T


Life2Ledger Blockchain Solutions

All (sensor) data that have, or could have, any medical, juridical, financial/economic or scientific value must have guaranteed authenticity and demonstrable provenance

Big and Privacy Sensitive Data Authenticity Solutions for IoT and IoMT


Data must be proven authentic to

  • prevent accidents due to data pollution
  • retain scientific value
  • be usable in machine learning
  • be usable when training algorithms for AI
  • prevent fraud and other criminal activities
  • stand up in a court of law

In addition, medical data must be GDPR compliant.

    Solution =>

    Authenticity with Blockchain Technology

    Premise

There will be an explosion of data during the coming years. It is estimated that the Internet of Things (IoT) and the Internet of Medical Things (IoMT) alone will generate 160 Zettabytes (1 Zettabyte = 1 billion Terabytes). These data must have guaranteed authenticity in the broadest sense of the word.

    When we speak of the IoT or IoMT, we generally refer to network-connected devices whose purpose is to gather and disseminate data autonomously and, increasingly, to act autonomously upon these data to carry out tasks, such as a network-connected printer that orders its own supplies when they run low. We employ a very broad definition of IoT, including all types of connected sensors and meters, actuators (devices that do something in the physical world, like robots or drones), as well as the software that runs them.

    The main challenge, then, is how to secure these data as they are transmitted over the internet from a device to a central authority or server, and to make sure that

  • the data we are looking at are the data we are supposed to look at;
  • the data measured at time t0 are still identical to the data we read at time tn;
  • the authentic data are at the same time GDPR compliant when we are dealing with, e.g., patient data.

    In addition, the tsunami of Io(M)T devices coming on stream will have to be managed and administered. This calls for solutions built on new technologies, e.g. Distributed Ledger Technologies and smart contracts.
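    A common way to check that the data read at time tn are still identical to the data measured at t0 is to record a cryptographic hash of each measurement at the source and anchor that hash somewhere tamper-evident (e.g. in a blockchain transaction). The sketch below illustrates the idea in Python; it is a minimal illustration, not Life2Ledger's actual protocol, and the record fields and function names are hypothetical:

```python
import hashlib
import json

def fingerprint(record: dict) -> str:
    """Compute a SHA-256 fingerprint of a sensor record.

    Serializing with sorted keys and fixed separators makes the
    hash independent of key ordering, so the same record always
    yields the same digest.
    """
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# At time t0: the device computes the fingerprint and anchors it
# (e.g. in a ledger transaction -- the anchoring step is omitted here).
measurement = {"device": "spo2-sensor-17", "t": 1700000000, "value": 97.2}
anchored = fingerprint(measurement)

# At time tn: anyone holding the data can recompute the fingerprint
# and compare it against the anchored digest.
received = {"device": "spo2-sensor-17", "t": 1700000000, "value": 97.2}
assert fingerprint(received) == anchored   # data are unchanged

tampered = {"device": "spo2-sensor-17", "t": 1700000000, "value": 99.9}
assert fingerprint(tampered) != anchored   # any change is detected
```

    Note that the hash only proves integrity; the anchoring medium (here, a distributed ledger) is what makes the recorded digest itself trustworthy.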


    Data pollution

    Non-authentic data due to accidental change, tampering or hacking

    Pollution of data can be defined as data being changed during or after measurement, either by accident, purposely (for scientific reasons, e.g. removing outliers) or deliberately (with malicious intent).
    The importance of data authenticity may be illustrated with the following examples.

    Data pollution during ML processes and AI

    The use of polluted data could introduce bias during machine learning, as well as in processes meant to train algorithms for artificial intelligence. The consequences could be dire: wrong decisions could harm persons and entities and cause physical injury or even death, for instance through erroneous AI-assisted diagnostics during medical treatment. The explosion of data from IoT/IoMT devices makes certainty about the data one works with urgent. Data files and image (DICOM) files should be guaranteed authentic. When image data are corrupted, important information could be lost, leading to misjudgment by the medical practitioner. The practitioner also wants to be 100% sure they are looking at the correct, unpolluted data of the right person.

    Tampered data to influence (legal) outcome

    Imagine a database containing data used during a court case by two parties: A, an environmental activist group, and B, a party accused of being a polluter. An honest judge should always ask the question, "How can I be sure the data presented as coming from the database are not polluted?" The answer is: by using blockchain technology.

    Research

    Often M.Sc. students or PhD candidates want to make use of existing data from a remote database. The question is: how can the researcher be certain that the data are not polluted? The answer, again, is distributed ledger technologies such as blockchain.


    The PEBL™ protocol stack is built on the newest transmission technologies, guaranteeing authenticity and provenance of IoMT data.

    Solutions & New Technology

    Life2Ledger uses two blockchain technologies: the Algorand non-permissioned blockchain and its own proprietary Permissioned Ephemeral Blockchain, PEBL.

    Algorand non-permissioned blockchain

    Algorand is a blockchain project whose goal is to create a transparent system in which everyone can achieve success through decentralized projects and applications. The Algorand Foundation actively supports developers in launching and growing the network. The main feature of this project is its open-source public blockchain. Algorand's consensus mechanism promotes the performance, security and openness of a decentralized network; thanks to it, Algorand's transaction throughput competes with that of well-known financial and payment systems.

    PEBL

    Life2Ledger has created its own proprietary DLT, PEBL, which stands for Permissioned Ephemeral Blockchain. PEBL is stateless and built on top of the existing TCP - TLS - HTTP/2 - gRPC protocol stack, using our proprietary Fuzzy PoW algorithm. This guarantees data authenticity during the transmission of data from IoT/IoMT devices. PEBL is an amalgamation of cryptographic and statistical tools that provides a robust chain during data transmission.
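    The details of PEBL and the Fuzzy PoW algorithm are proprietary, but the core idea of chaining transmitted records so that any alteration or reordering breaks the chain can be sketched as follows. This is an illustrative hash chain in Python, not the actual PEBL implementation; all names are hypothetical:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class ChainedRecord:
    payload: bytes   # the sensor reading being transmitted
    prev_hash: str   # hash of the previous record in the chain
    hash: str        # SHA-256 over (prev_hash + payload)

def link(payload: bytes, prev_hash: str) -> ChainedRecord:
    """Append one record to the chain by hashing it together
    with the previous record's hash."""
    h = hashlib.sha256(prev_hash.encode() + payload).hexdigest()
    return ChainedRecord(payload, prev_hash, h)

def verify(chain: list, genesis: str = "0" * 64) -> bool:
    """Recompute every link; any tampered payload or broken
    ordering makes verification fail."""
    prev = genesis
    for rec in chain:
        if rec.prev_hash != prev:
            return False
        expected = hashlib.sha256(prev.encode() + rec.payload).hexdigest()
        if rec.hash != expected:
            return False
        prev = rec.hash
    return True

# Build a small chain of three readings.
chain = []
prev = "0" * 64
for reading in (b"hr=72", b"hr=74", b"hr=71"):
    rec = link(reading, prev)
    chain.append(rec)
    prev = rec.hash

assert verify(chain)            # the intact chain verifies
chain[1].payload = b"hr=40"     # tamper with one reading...
assert not verify(chain)        # ...and verification fails
```

    Because each record commits to its predecessor, a verifier only needs the final hash to detect any change anywhere upstream, which is what makes a chain structure attractive for streaming sensor data.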

    IT Security

    It must be emphasized that Life2Ledger is not specialized in IT security; however, our technology and solutions enhance security as a by-product.