Connected devices handle a lot of data. Sometimes, that data is sensitive, personal, confidential, or proprietary. Those devices and the services they connect to can present an undesirable risk if they allow malicious actors to gain access. Cybersecurity from device to cloud is critical in minimizing those risks.
Cybersecurity from device to cloud entails taking a connected device with a trustworthy identity that runs trustworthy firmware and stores/generates trustworthy data, then connecting it to a trusted service in the cloud and associating it with a known and authorized user account.
More simply, it boils down to the following things:

- A device with a trustworthy, unique identity
- Trustworthy firmware and data on that device
- A trusted cloud service for the device to connect to
- A known and authorized user account that the device is associated with
Provisioning is the process of taking an end device with a unique identity and connecting it to a cloud service via the internet, validating that unique identity, and associating that device with a valid user account.
To validate a unique identity, each device first needs one established. Ideally, this identity is tied to hardware, made unmodifiable, and assigned when the device is manufactured. This can be done with a hardware-based unique ID chip, or by storing the identity in one-time-programmable (OTP) memory or eFuses. A hardware-based unique ID chip makes uniqueness easy to guarantee, as long as the chip adheres to a standard that guarantees uniqueness (such as EUI-48 or EUI-64) and the same standard is used across the device fleet. The main thing is that the process of assigning IDs is rigorous enough to guarantee that no two devices within the fleet are assigned the same ID. Once devices have unique identities, those IDs should be recorded and controlled at manufacturing time, so that only known IDs are allowed to register during provisioning.
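The record-and-control step above can be sketched in a few lines. This is an illustrative model only, assuming EUI-64 identifiers; the function names (record_device, may_register) are hypothetical, not from any real provisioning API.

```python
# Toy model of controlling device IDs at manufacturing time.
# An EUI-64 is 8 bytes, i.e. 16 hex characters.
MANUFACTURED_IDS = set()

def _canonical(eui64: str) -> str:
    # Accept common separator styles ("00-1A-..." or "00:1a:...").
    return eui64.replace("-", "").replace(":", "").upper()

def record_device(eui64: str) -> None:
    """Record a device ID at manufacturing time; duplicates are a hard error."""
    canonical = _canonical(eui64)
    if len(canonical) != 16 or any(c not in "0123456789ABCDEF" for c in canonical):
        raise ValueError(f"not a valid EUI-64: {eui64!r}")
    if canonical in MANUFACTURED_IDS:
        raise ValueError(f"duplicate ID assigned: {eui64!r}")
    MANUFACTURED_IDS.add(canonical)

def may_register(eui64: str) -> bool:
    """At provisioning time, only IDs recorded at manufacturing may register."""
    return _canonical(eui64) in MANUFACTURED_IDS
```

The duplicate check is the rigor mentioned above: the manufacturing process itself refuses to assign the same ID twice, and provisioning refuses any ID it has never seen.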
The next step is for each device to generate a public/private key pair and have a certificate created and loaded based on that key pair. To keep the private key safe and secret, you will want to use some sort of secure hardware-based key storage, like a TPM or a cryptography chip like Microchip’s ECC chip, or something baked into your device’s microprocessor, like Arm TrustZone, if that is available.
Ideally, no. Self-signed certificates have their own challenges regarding their ability to validate authenticity. Instead, the device should create a certificate signing request and be assigned a certificate that has been signed by an entity trusted by both the end device and the cloud service. Generating a securely stored private key, creating a certificate, and having that certificate signed by a trusted entity establishes a reliable chain of trust.
X.509 certificates (the same certificates used for SSL/TLS encryption) provide a way to trace back to a trusted entity by validating signatures. That trusted entity produces its signature by calculating a checksum over a set of parameters in the generated certificate, then encrypting that checksum with its private key. The public key that pairs with that private key is included in the certificate along with the signature. Whenever validation takes place, that same checksum calculation is made, but instead of encrypting the checksum, the signature is decrypted with the public key. If the calculated checksum matches the checksum decrypted from the signature, the signature is considered valid.
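The sign-then-verify mechanics described above can be demonstrated with textbook RSA and deliberately tiny numbers. This is a toy sketch only: real certificates use 2048-bit-plus keys, standardized padding, and a proper library, never numbers like these.

```python
import hashlib

# Toy RSA key: modulus n = 61 * 53, public exponent e, private exponent d.
n, e, d = 3233, 17, 2753

def checksum(cert_params: bytes) -> int:
    # The "checksum over a set of parameters" -- a SHA-256 hash, reduced
    # mod n only because our toy modulus is tiny.
    return int.from_bytes(hashlib.sha256(cert_params).digest(), "big") % n

def sign(cert_params: bytes) -> int:
    # The trusted entity "encrypts" the checksum with its private key.
    return pow(checksum(cert_params), d, n)

def verify(cert_params: bytes, signature: int) -> bool:
    # The validator "decrypts" the signature with the public key and
    # compares against a freshly computed checksum.
    return pow(signature, e, n) == checksum(cert_params)
```

Note that only the public key (n, e) is needed to verify; the private exponent d stays with the signing entity, which is what makes the chain of trust work.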
A certificate chain is a set of certificates that eventually lead back to a Root Certificate Authority (Root CA) that is shared by the device and the service. Typically, the device certificate will be signed by an intermediate signing authority, which can be either the manufacturer or a provisioning service. The certificate of that intermediate signing authority can be signed by either another higher signing authority or directly by the Root CA. Eventually, the chain of signatures must lead to the Root CA to be trusted.
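The chain-walking logic can be sketched structurally, modeling certificates as simple dicts. This sketch checks only the issuer/subject links and the trusted root; real validation also verifies each signature and each certificate's validity period.

```python
# Structural sketch of walking a certificate chain back to a trusted
# Root CA. "Example Root CA" and the certificate contents are
# illustrative placeholders.
TRUSTED_ROOTS = {"Example Root CA"}

def chain_is_trusted(chain: list) -> bool:
    """chain[0] is the device certificate, chain[-1] the root candidate."""
    for cert, signer in zip(chain, chain[1:]):
        if cert["issuer"] != signer["subject"]:
            return False  # broken link: this cert was not signed by the next one
    root = chain[-1]
    # A Root CA certificate is self-signed and must already be trusted
    # by both the device and the service.
    return root["issuer"] == root["subject"] and root["subject"] in TRUSTED_ROOTS
```

The device certificate at the bottom may be signed by a manufacturer or provisioning-service CA in the middle, but the walk only succeeds if the last link is a root both sides already trust.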
Validation occurs when all the following things happen:

- Each signature in the chain verifies correctly against the public key of its signer.
- The chain of signatures terminates at a Root CA that the validating party already trusts.
- Every certificate in the chain is within its validity period and has not been revoked.
We also use X.509 certificates to validate that the provisioning service (or any of the other cloud services) is trusted. Each service that is expected to work with a device also needs to trace back to a trusted entity.
Each cloud service also has an X.509 certificate with its identity assigned to it (usually the hostname portion of the service’s URL, or perhaps a wildcard identity that covers that hostname). Those certificates must also be signed by a trusted entity that is known to the device. At manufacturing time, the device must have the certificate of that trusted entity (the Root CA Certificate) stored safely. That way, all communication can be secured using TLS encryption.
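On the device side, pinning the factory-installed Root CA Certificate might look like the following sketch using Python’s standard ssl module. The certificate path is a placeholder, and a real device stack may differ.

```python
import ssl

def make_device_tls_context(root_ca_path=None):
    """Build a TLS client context that trusts only the pinned Root CA.

    root_ca_path is the (hypothetical) location of the Root CA Certificate
    stored at manufacturing time; if omitted, the system trust store is used.
    """
    if root_ca_path:
        ctx = ssl.create_default_context(cafile=root_ca_path)
    else:
        ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # enforce strong TLS
    # create_default_context already enables hostname checking and
    # certificate verification; assert it rather than assume it.
    assert ctx.check_hostname and ctx.verify_mode == ssl.CERT_REQUIRED
    return ctx
```

Passing cafile replaces the default trust store entirely, so the device accepts only certificates that chain back to its pinned root.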
Aside from provisioning, communication between the device and the cloud can also be authenticated with what we call mutual authentication, where both the device and the service present certificates for authentication.
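From the service side, mutual authentication means the server both presents its own certificate and demands one from the device. A minimal sketch with Python’s ssl module, where all file paths are placeholders:

```python
import ssl

def make_mutual_tls_server_context(cert_file=None, key_file=None, device_root_ca=None):
    """Build a server context that requires a valid client certificate.

    cert_file/key_file are the service's own identity; device_root_ca is
    the CA that signed the device certificates. Paths are hypothetical.
    """
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    if cert_file and key_file:
        ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)  # server identity
    if device_root_ca:
        ctx.load_verify_locations(cafile=device_root_ca)  # trust anchor for devices
    ctx.verify_mode = ssl.CERT_REQUIRED  # reject clients without a valid certificate
    return ctx
```

The key line is verify_mode = CERT_REQUIRED: without it, the server authenticates itself to the device but never checks who the device is.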
For provisioning, certificate assignment can be done in one of two ways:

- At manufacturing time, where the manufacturer acts as the intermediate signing authority, signs the device’s certificate, and loads it onto the device before it ships.
- At first connection, where the device sends a certificate signing request to a provisioning service, which acts as the intermediate signing authority and returns the signed certificate.
Ideally, private keys should be managed by a Key Management Service of some sort that is associated with the customer account of the device manufacturer. That Key Management Service instance should be backed by a physical hardware key storage device. That way, keys can be stored safely and securely.
Well, one thing we’re worried about is a malicious actor attempting to impersonate an end device or the cloud service. Assuming the process of securing the communication channel is not faulty, impersonating a device would require gaining access to the identity of the device, including the private key. If a malicious actor gained access to our private key, they could initiate communication with the cloud services and impersonate our device. We mitigate this first by protecting that private key with a hardware-based key storage device, as mentioned earlier. Second, we design the device so that the private key never has to leave that key storage device, so it is never exposed.
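The "key never leaves storage" design can be modeled as an interface: callers can ask for signatures, but never for the key itself. This is a toy sketch; HMAC stands in for the chip’s signing operation, and in a real device the boundary is enforced by TPM or secure-element hardware, not by the software language.

```python
import hashlib
import hmac
import os

class KeyStore:
    """Toy model of hardware key storage: only signatures cross the boundary."""

    def __init__(self):
        # Generated inside the "chip" and never handed out.
        self.__key = os.urandom(32)

    def sign(self, message: bytes) -> bytes:
        # The signing operation happens inside the store; the key stays put.
        return hmac.new(self.__key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), tag)
```

Python’s name mangling is only a convention, of course; the point of real hardware key storage is that no software path to the key exists at all.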
Another thing we’re worried about is the device inadvertently leaking important data. Perhaps the configuration for establishing secure communications is baked into your device firmware or stored in some sort of configuration file. If the location of that configuration were to leak, a malicious actor could attempt to modify the configuration to gain access while bypassing authentication.
We mitigate this first by encrypting firmware and important data, which meaningfully obscures those inner workings and makes them infeasible to decipher without the encryption key. Second, we sign firmware and have the device validate the signature, which ensures that the firmware that runs is from a valid source. Firmware signing also relies on a public/private key pair, and the private key must be kept secret for this mitigation to work.
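The boot-time side of firmware signing can be sketched with textbook RSA and tiny toy numbers. This is illustration only: real firmware signing uses proper schemes (e.g. RSA-PSS or ECDSA) with large keys, and the verifying public key lives in hardware key storage on the device.

```python
import hashlib

# Toy manufacturer key pair. D is the private exponent, held only by the
# manufacturer (ideally inside CI/CD so it is never exposed); N and E are
# the public key baked into the device.
N, E = 3233, 17
D = 2753

def firmware_digest(image: bytes) -> int:
    # Reduced mod N only because the toy modulus is tiny.
    return int.from_bytes(hashlib.sha256(image).digest(), "big") % N

def sign_firmware(image: bytes) -> int:
    # Done at build time by the manufacturer.
    return pow(firmware_digest(image), D, N)

def boot(image: bytes, signature: int) -> bool:
    # The device refuses to run firmware whose signature does not verify.
    return pow(signature, E, N) == firmware_digest(image)
```

Because only N and E are present on the device, an attacker who extracts the firmware still cannot produce a signature that boot() will accept.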
One last thing to protect against is a malicious actor attempting to impersonate the cloud services. This is largely mitigated by following best practices for configuring and enforcing strong TLS encryption for all cloud services, as well as following best practices for managing permission and authorization to access and configure those cloud services. You must also follow best practices for configuring and enforcing strong TLS encryption at each end device, by validating certificates against the Root CA Certificate of your trusted entity. If a malicious actor is attempting to impersonate your cloud service but cannot replicate the required signature because they do not have access to the private key of your trusted entity, that impersonation attempt fails.
When you are warned of a problem with certificate validation (in a browser, say), you are typically presented with a choice. Device firmware, by contrast, is typically programmed to choose automatically without presenting an option. In those cases, the right choice is to automatically reject a certificate that cannot be validated. Often, a device manufacturer will run into an issue in production where a certificate expires or some part of a service’s configuration changes. The temptation will be to program the device to ignore certificate validation errors to keep devices already in production running smoothly. However, this is a bad choice. A device that ignores certificate validation errors loses its ability to detect a malicious actor attempting to impersonate a legitimate service.
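A fail-closed policy can be written as a small wrapper: on a validation error, refuse the connection and report it, rather than retrying with verification disabled. The connect and report callables here are placeholders for whatever connection and telemetry functions the firmware provides.

```python
import ssl

def connect_fail_closed(connect, report):
    """Attempt a TLS connection; on a validation failure, refuse and report.

    Never falls back to an unverified connection.
    """
    try:
        return connect()
    except ssl.SSLCertVerificationError as exc:
        report(f"rejected unverifiable certificate: {exc}")
        return None  # fail closed; do NOT retry with verification disabled
```

Reporting the failure matters as much as rejecting it: an expired certificate in production gets noticed and fixed instead of silently worked around.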
If the private key of an end device is compromised, then only that device is impersonated. That device is associated with only one user account, and the device can’t leak control of other devices associated with other accounts. Also, that device certificate can easily be revoked. In this scenario, the overall risk associated with losing that key is fairly small, but still undesirable.
If the private key of the firmware signature is compromised, all devices lose the ability to authenticate that the source of any new firmware is legitimate, which can lead to further loss of data, including loss of other secrets. The risk associated with losing that key is high.
If the private key of any of the cloud services is compromised, all devices lose the ability to authenticate that the cloud services they connect to are legitimate, which can also lead to further loss of data, including loss of other secrets. This could also potentially lead to significant loss of control of cloud service infrastructure. The risk associated with losing these keys is very high.
Keys are protected by hardware-based key storage devices. It is important to never store keys or secrets in code, in unprotected files, or anywhere else they are exposed. As mentioned earlier, devices should always use something hardware-based like a TPM, Microchip’s ECC chip, or Arm TrustZone to protect keys. Keys like the certificate private key, firmware encryption key, and firmware signature public key should live in this key storage.
Cloud services can use Key Management Services that are backed by a hardware key storage device. Keys associated with all cloud services should live in this Key Management Service. The firmware encryption key and the firmware signature private key can also be stored in a Secrets Management Service that is backed by the same hardware key storage device as the Key Management Service. You can further protect firmware signature keys by using CI/CD to automate the process of encrypting and signing firmware. This is to reduce exposure of those keys.
It’s important to note a few differences between hardware-based key storage and more generic secrets management tools. A TPM secures private keys in 3 ways:

- It keeps keys invisible: keys live inside the TPM and cannot be read out while not in use.
- It performs cryptographic operations internally on your behalf, so keys are never exposed to the application.
- It is built to be tamper resistant.
Like a TPM, Key Management Services offer a place to keep your keys invisible while they are not in use, and they can also perform certain operations on your behalf to avoid exposing keys. Tamper resistance isn’t a direct benefit, but Key Management Services do rely on hardware like a TPM that should have tamper resistance built into it.
Secrets Management Tools differ in that they only provide a place to keep secrets when they are not in use. While they can be protected by hardware like a TPM, they do not perform any cryptography operations on your behalf using your stored secrets. This type of tool should only be used when a secret needs to be known to an application directly for it to do its job. Care must also be taken to ensure these types of secrets are not being inadvertently saved or transmitted outside of the application using that secret when using this type of tool.
We have talked about what goes into cybersecurity from the device to the cloud, including:

- Establishing a unique, hardware-backed identity for each device
- Provisioning devices with certificates that chain back to a trusted Root CA
- Authenticating devices and services to each other with TLS and mutual authentication
- Mitigating impersonation and data leakage by encrypting and signing firmware
- Protecting keys with hardware-based key storage, Key Management Services, and Secrets Management Tools
You want to make sure your cybersecurity posture is strong from your devices to the cloud, but you don’t have to do it on your own. Schedule a meeting with our team today and turn your secure device ideas into reality.