CIA in shipping, part 2: How to ensure data integrity
Written by Walter Hannemann, Product Manager | 18 February 2020
In last week’s instalment of our three-part blog article, we looked at the CIA triad, and the role it plays in maritime data security.
The CIA triad of confidentiality, integrity and availability is a security model designed to assess organisations for cyber risk and to guide policies and procedures for data security within them.
Today, we’ll talk about the second component: Integrity.
Why is it important to maintain data integrity?
Data integrity deals with the question “Do you trust what you see?” It is important because of the principle of ‘garbage in, garbage out’: if the data cannot be trusted, any information derived from it cannot be trusted either.
For shipping, data integrity is essential for the accuracy and efficiency of a vessel’s normal operation as well as its business processes. Imagine making a vitally important operational or business decision that hinges on data which is entirely, or even partially, incorrect.
In a modern maritime organisation like yours, data-driven decisions are made all the time across your fleet. With compromised data integrity, those decisions can potentially have dramatic safety, environmental and commercial consequences.
How to preserve the integrity of your vessel data
Securing data integrity means preserving its accuracy and completeness during all stages of production, communication, storage and retrieval. To do this, you need to keep the data from being changed while it is being handled, transferred or replicated.
The acronym ALCOA is used as a framework for securing data integrity. Data is expected to be:
Attributable: Data should clearly demonstrate who observed and recorded it, when it was observed and recorded, and who it is about.
Legible: Data should be easy to understand, recorded permanently, and original entries should be preserved. Ensuring data is understandable and permanent makes it more accessible throughout the data lifecycle.
Contemporaneous: Data should be recorded as it was observed, at the time the activity took place.
Original: Source data should be accessible and preserved in its original form.
Accurate: Data should be error-free, and conform with the protocol.
Because data can be compromised intentionally or not, data integrity is not only about protecting from unauthorised tampering but also ensuring that the equipment and software don’t generate integrity problems.
Ensure information integrity from a trustworthy data source
In order to trust the information generated by processing the data, you also need to ensure that you can trust the data collected.
As more and more data comes from onboard sensors and IoT devices, you need to be able to trust these to provide accurate measurements, and hence accurate data. How this data travels from source to destination, where it is used to provide actionable information, is also critical: during transmission, “noise” or other impairments can introduce data errors.
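One common way to detect transmission errors like these is to send a cryptographic digest alongside the payload and recompute it on arrival. The sketch below is illustrative only, using Python’s standard `hashlib`; the field names in the sample reading are hypothetical.

```python
import hashlib

def checksum(payload: bytes) -> str:
    """Compute a SHA-256 digest of the payload before transmission."""
    return hashlib.sha256(payload).hexdigest()

def verify(payload: bytes, expected_digest: str) -> bool:
    """Recompute the digest on arrival; a mismatch means the data
    was corrupted (or altered) in transit and should be rejected."""
    return checksum(payload) == expected_digest

# Sender side: compute the digest alongside a (hypothetical) sensor reading.
reading = b'{"sensor": "fuel_flow", "value": 412.7}'
digest = checksum(reading)

# Receiver side: an intact payload verifies, a corrupted one does not.
assert verify(reading, digest)
assert not verify(reading + b" ", digest)  # even one extra byte is caught
```

Note that a plain hash only detects accidental corruption; guarding against deliberate tampering in transit would additionally require a keyed construction such as an HMAC, or a signed/encrypted channel.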
Protect data integrity from unintended and malicious modification
Data integrity also requires protection from modification, meaning that the data has been changed, updated or altered for any of multiple reasons. Modification can be unintended (accidental) or intended (whether beneficial or malicious).
A key component to achieving this is to only allow access to authorised parties, identify them and log their actions.
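The three parts of that control (authorise, identify, log) can be sketched as a single write path that checks the caller against an allow-list and records every attempt, allowed or not. This is a minimal illustration with invented user and field names, not a production access-control design.

```python
from datetime import datetime, timezone

# Hypothetical allow-list of identities permitted to write this data.
AUTHORISED_WRITERS = {"chief_engineer", "noon_report_system"}

audit_log = []  # every write attempt is recorded here, allowed or not

def write_record(user: str, field: str, value, store: dict) -> bool:
    """Apply a change only for authorised users, and log every attempt."""
    allowed = user in AUTHORISED_WRITERS
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "who": user,
        "field": field,
        "value": value,
        "allowed": allowed,
    })
    if allowed:
        store[field] = value
    return allowed

store = {}
assert write_record("chief_engineer", "fuel_rob", 950.0, store)
assert not write_record("unknown_device", "fuel_rob", 0.0, store)
assert store["fuel_rob"] == 950.0  # the unauthorised write never landed
assert len(audit_log) == 2         # but both attempts were logged
```

Logging rejected attempts as well as successful ones is the point: the audit trail is what lets you later answer “who changed this, and when?”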
Monitor, verify, validate and measure data integrity
In many cases, it’s not easy to verify whether data has been modified, intentionally or not. It’s also important to identify which data is more critical than the rest, so that stronger controls can be applied to it.
Here are some tried-and-true best practices for securing data integrity:
Only write data that is verified to be correct (e.g. establish limits to data ranges)
Only allow the correct people/system to write data (which requires determining data ownership)
Log all changes
Have hardware and software that are up-to-date and trusted
Have reliable network and online resources
Back up, so that the right data can be recovered if a problem is detected
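The first practice above, writing only verified data, can be made concrete with simple range checks. The sketch below assumes hypothetical field names and limits; in practice the valid ranges would come from the sensor specifications or the reporting protocol.

```python
# Hypothetical validity ranges per field (units in the key names).
RANGES = {
    "speed_knots": (0.0, 30.0),
    "fuel_flow_lph": (0.0, 5000.0),
}

def validate(field: str, value: float) -> bool:
    """Accept a value only if it falls within the agreed range for its field."""
    low, high = RANGES[field]
    return low <= value <= high

assert validate("speed_knots", 14.2)          # plausible: accept
assert not validate("speed_knots", 85.0)      # implausible: reject, don't store
assert not validate("fuel_flow_lph", -10.0)   # negative flow: reject
```

Rejecting implausible values at the point of writing is far cheaper than discovering, months later, that a noon report or performance model was built on a corrupted reading.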
Once you have taken the necessary measures to keep onboard data correct and reliable, the next step in safeguarding your vessel data is to make sure the data is available, i.e. that the right users have easy and reliable access to it when they need it.
Walter Hannemann started his career in a computer factory’s product development laboratory in 1983, while studying Electronics and Information Systems. Since then, his jobs have involved software architecture and development, infrastructure design and overall IT management, in both large enterprises and startups. With a passion for “making things work”, shipping applications and all digital things onboard ships became his interest after joining Maersk in 2008. Managing IT in large companies like Maersk Tankers and Torm has given him insider knowledge of the shipping industry and fuelled his entrepreneurial drive to help move the industry into the digital future.
Based in Copenhagen as Product Manager for Dualog, Walter enjoys finding solutions for big (and small) problems while keeping the overview and a forward-looking approach, with deep dives in technical subjects when necessary – or possible.