Architecture Assertions
How WoT implementers help the Architecture specification!
Ege Korkan and Michael McCool
Goal of Testing
The WoT Working Group needs to verify the implementability of each specification before it can be published as a W3C Recommendation.
Result of Testing
Testing results in a detailed Implementation Report:
https://w3c.github.io/wot-architecture/testing/report11.html
More than 50 lines like this!
Goal of Testfests
The Working Group verifies the implementability of each specification using events called Testfests.
For Architecture, during a Testfest we ask implementers to indicate which features specified by the Architecture specification they have implemented.
The next Testfest is scheduled for the week of April 24 and will be online.
However, new test results can be submitted at any time with a PR; it is not necessary to wait for a Testfest.
Goal of this Event
We have realized that some of our assertions are not very self-explanatory.
It is not always possible to have multiple sentences explaining an assertion in the specification. We want to use this slideset to provide additional description on how to implement such features. This way, you can start submitting implementation results.
Features that are not implemented become at-risk features.
At Risk: What does it imply?
When an assertion (i.e. a feature) has fewer than two implementations, it cannot be part of the final W3C Recommendation.
When we publish a Candidate Recommendation, any assertion lacking sufficient implementations becomes at risk and is highlighted in yellow in the specification.
Assertion Context
It is not always possible to understand an assertion by itself.
Please also look at the context.
For example, use of TLS depends on the visibility of the system, whether it is protected by other means (e.g. private network), whether it is a greenfield system, the sensitivity of the data, etc.
Danger Zone!
Please note that we are looking at features remaining after we have already done significant testing.
These are sometimes difficult to implement, and sometimes are only applicable to special cases.
How to Contribute
Instructions: Please follow the GitHub Readme at https://github.com/w3c/wot-testing/tree/main/events/2023.03.DevMtg
Deadline: 10 May 2023
This is the planned Proposed Recommendation transition date minus 3 weeks so that we have time to fix the specification and complete the PR transition process.
If you need help with submitting results, contact Michael Lagally, Ege Korkan, or Michael McCool.
Overall List of At Risk Features in Architecture
From https://github.com/w3c/wot-architecture/blob/main/testing/atrisk.csv
Table columns: Assertion | # Implementations Needed | Implementer
Some of these need 1 and some need 2 new implementations.
Corresponding slides also include the number needed.
Assertion Categories
Structure of each Assertion Explanation
Assertion ID and Link to Context
What a developer should do
Assertion description from the specification
How many more implementations we need
Network Security Assertions
Given that WoT interactions happen over the network, the usage of secure network protocols should be considered.
In the case of implicit access control via access to a common network a segmented network SHOULD be used.
Developer Instructions:
“Implicit access control” means that access to the IoT devices is by access to the network, and they don’t necessarily have their own access controls, or they are insecure (e.g. Basic Authentication on HTTP without TLS). In this case, having a separate (ideally encrypted) network allows access to IoT devices to be controlled separately from other devices. This also protects other devices from rogue IoT devices.
To satisfy this, a set of WoT Things simply need to be deployed on their own network. Many home routers provide secondary “Guest” networks that can be used for this purpose.
Example: In a smart home, you may want to give guests access to your WiFi and external internet access but they should not be able to control your heating or smart speakers. This is easy if IoT devices and “normal” WiFi use different subnetworks, SSIDs, and passwords. For some IoT devices, such as IP cameras, a separate wired network is often used anyway for bandwidth reasons.
Need 0 more implementations
When a Thing is made available on a private network then it SHOULD be protected by secure transport such as TLS or DTLS.
Developer Instructions:
You should ideally use TLS or DTLS even if the device is on a local/private network. The SHOULD means this is optional if the network itself is already encrypted; we still need two implementations, however. Note that similar assertions for public networks already have a sufficient number of implementations.
To satisfy this, deploy at least one Thing on a private network (i.e. a LAN) with TLS or DTLS enabled. Note there are other relevant assertions about the minimum versions of TLS or DTLS necessary that may also need to be satisfied.
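As a concrete starting point, here is a minimal sketch in Python of a server-side TLS context for a Thing exposed over HTTPS on a LAN. The certificate file names in the comment are placeholder assumptions; on a private network these would usually be manually installed self-signed certificates.

```python
import ssl

def make_thing_tls_context() -> ssl.SSLContext:
    """Server-side TLS context for a Thing exposed over HTTPS on a LAN."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    # Refuse TLS versions older than 1.2 (see the related minimum-version
    # assertions mentioned above).
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

ctx = make_thing_tls_context()
# In a real deployment, load the (likely self-signed) certificate and key,
# then hand `ctx` to the HTTP server hosting the Thing:
# ctx.load_cert_chain(certfile="thing-cert.pem", keyfile="thing-key.pem")
```

The same context setup applies whether the Thing is served by a plain HTTP server or a full WoT runtime; only the server integration differs.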
Need 0 more implementations
In commercial and industrial environments, explicit installation of pre-shared keys SHOULD be used to allow browsers to access local services while using TLS.
Note: “pre-shared keys” should probably be “certificates”; see Architecture Issue 900.
Developer Instructions:
TLS needs certificates to validate servers (and for mutual TLS, clients) but cannot use the Certificate Authority mechanism in local/private networks since the CA system uses publicly visible URLs to use as an “identity”. To get TLS to work on a LAN, you need to manually install “self-signed” certificates where appropriate.
Note that for the HTTP protocol a certificate only at the server would minimally satisfy this assertion, but stronger security using mutual TLS requires certificates at both client and server. Companies often require installation of their own client certificates to control access to services on internal corporate networks.
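A hedged sketch of the server side of mutual TLS using Python's standard `ssl` module follows; the commented file names (server certificate, locally installed private CA) are assumptions standing in for certificates distributed out of band, e.g. by corporate IT.

```python
import ssl

def make_mutual_tls_context() -> ssl.SSLContext:
    """Server-side context that demands a valid client certificate."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.verify_mode = ssl.CERT_REQUIRED  # reject clients without a certificate
    # Certificates are installed explicitly, outside the public CA system:
    # ctx.load_cert_chain("server-cert.pem", "server-key.pem")
    # ctx.load_verify_locations("private-ca.pem")  # trust the local CA only
    return ctx
```

Dropping the `verify_mode` line yields the weaker server-certificate-only configuration that minimally satisfies the assertion.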
Need 1 more implementation
When secure transport over UDP is appropriate, then at least DTLS 1.3 [RFC9147] SHOULD be used.
Developer Instructions:
If you are going to use DTLS, it should be version 1.3 and up. This would be applicable for CoAPS for example. The difficulty is that DTLS 1.3 is fairly recent and although TLS 1.3 libraries are widely available, DTLS 1.3 libraries are not, even though DTLS 1.3 is based directly on TLS 1.3. However, the main difference is that certain now-insecure cryptosuites that were allowed in (D)TLS 1.2 were removed for (D)TLS 1.3.
To satisfy this, deploy a Thing (e.g. using the CoAP protocol) using a DTLS 1.3 library. Use of a beta library is acceptable.
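Python's standard library provides TLS but not DTLS, so an actual DTLS 1.3 deployment needs a third-party binding (e.g. to wolfSSL or mbedTLS; availability is an assumption, not part of the WoT specifications). A small startup check of the linked TLS stack's 1.3 support, as a sketch of "verify before exposing the Thing":

```python
import ssl

def supports_tls13() -> bool:
    """True if the TLS library Python links against supports version 1.3.

    DTLS 1.3 itself requires a DTLS-capable library; this check only gates
    on 1.3 support in the underlying stack, since DTLS 1.3 is based
    directly on TLS 1.3.
    """
    return bool(getattr(ssl, "HAS_TLSv1_3", False))
```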
Need 2 more implementations
Runtime Assertions
These features apply to runtimes and more specifically to runtimes for Thing implementations.
The WoT Runtime SHOULD NOT directly expose native device interfaces to the script developers.
Developer Instructions:
This is about a Thing implementation on a runtime like node-wot. The “script” (could be a compiled program) that implements the WoT functionality should not be able to access low-level hardware interfaces directly. Instead there should be some intermediate abstraction, such as a library API. This library is a “HAL” (hardware abstraction) layer that has many benefits, including portability, but for security it is a useful place to do sanity-checking.
This is satisfied in practice if hardware registers are not (and cannot be) accessed directly by the “script” - which is generally true for runtimes using languages like Python and Javascript. For C/C++ running on “bare metal” in constrained devices (no OS, no separate supervisor mode…) it is more difficult to make the HAL mandatory, which is why this is optional.
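A minimal sketch of the idea, assuming an illustrative GPIO-style device (the class name, pin range, and register dictionary are invented for the example): the script sees only the HAL's checked methods, never raw registers.

```python
class GpioHal:
    """Hardware abstraction layer: the only sanctioned path to the pins."""

    VALID_PINS = range(28)  # e.g. a Raspberry-Pi-style header (assumption)

    def __init__(self) -> None:
        self._registers = {}  # stands in for memory-mapped hardware registers

    def write(self, pin: int, value: bool) -> None:
        if pin not in self.VALID_PINS:  # sanity check before touching hardware
            raise ValueError(f"invalid pin {pin}")
        self._registers[pin] = value    # a real HAL would poke the register here

    def read(self, pin: int) -> bool:
        if pin not in self.VALID_PINS:
            raise ValueError(f"invalid pin {pin}")
        return self._registers.get(pin, False)
```

Because the "registers" are private to the HAL, a script holding a `GpioHal` instance cannot bypass the sanity checks, which is exactly the property this assertion asks for.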
Need 0 more implementations
A WoT Runtime implementation SHOULD provide a hardware abstraction layer for accessing the native device interfaces.
Developer Instructions:
This is generally satisfied if arch-security-consideration-avoid-direct is satisfied. However, this assertion is about a HAL being available, while that assertion is about disallowing other ways to access the devices. Obviously you can’t require use of a HAL unless one exists.
Need 0 more implementations
Hardware abstraction layers SHOULD refuse to execute commands that might put the device (or environment) to an unsafe state.
Developer Instructions:
This assertion is related to arch-security-consideration-use-hal and arch-security-consideration-avoid-direct but is more strict in the sense that the HAL should also perform safety checks.
As a best practice, embedded IoT devices should have hardware interlocks, not just software interlocks, to avoid unsafe conditions. But if it is possible to put a device into an unsafe state via software, the HAL should attempt to prevent it.
Example: Consider an addressable RGB strip whose power supply cannot drive full brightness on all LEDs on all channels at once. In this case the HAL may compute the total power of a requested configuration and reduce the overall brightness to match what the supply can produce. A good hardware design would also include a fuse. However, the fuse might fail to blow, hence the “might”: The HAL does not need to be the only system attempting to prevent unsafe conditions for this assertion to be satisfied by an implementation.
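The RGB strip example can be sketched as a HAL-level check; all numbers (LED count, per-channel current, supply budget) are illustrative assumptions.

```python
LED_COUNT = 60
MA_PER_CHANNEL_FULL = 20   # mA drawn by one colour channel at brightness 255
SUPPLY_BUDGET_MA = 2000    # what the power supply can safely deliver

def clamp_brightness(r: int, g: int, b: int) -> tuple:
    """Scale a requested colour down so total strip current stays in budget."""
    requested_ma = LED_COUNT * (r + g + b) / 255 * MA_PER_CHANNEL_FULL
    if requested_ma <= SUPPLY_BUDGET_MA:
        return (r, g, b)                      # safe as requested
    scale = SUPPLY_BUDGET_MA / requested_ma   # uniform dimming preserves hue
    return (int(r * scale), int(g * scale), int(b * scale))
```

With these numbers, full white (255, 255, 255) would request 3600 mA and is dimmed uniformly to stay within the 2000 mA budget, while dim colours pass through unchanged.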
Need 1 more implementation
Post-manufacturing provisioning or update of scripts, the WoT Runtime itself or any related data SHOULD be done in a secure fashion.
Developer Instructions:
If the runtime allows updating the scripts (this term also includes compiled firmware) running on it, or updates to keys (“provisioning”), or related data (for example, extension vocabulary definitions) that update should be “secure”. This means that the entity sending the update should be authenticated and authorized, the update should be signed, the signature should be verified, and secure transport (such as TLS, securely identifying at least the server providing the update) should be used. Note that secure update can be either pull (the device logging into a secure server on the internet, often automatic) or push (an entity/admin logging into the device and uploading an update).
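A simplified sketch of the "verify before apply" step follows. A real updater would use an asymmetric signature (e.g. Ed25519) so that only a public key lives on the device; an HMAC with a device-provisioned key is used here purely to keep the sketch standard-library-only.

```python
import hashlib
import hmac

def verify_update(blob: bytes, signature: bytes, key: bytes) -> bool:
    """Check the update's signature before anything is written to flash."""
    expected = hmac.new(key, blob, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

def apply_update(blob: bytes, signature: bytes, key: bytes) -> None:
    if not verify_update(blob, signature, key):
        raise PermissionError("update rejected: bad signature")
    # A real runtime would now write blob to a staging partition and
    # reboot into it, keeping the old image for rollback.
```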
Need 0 more implementations
Security and Privacy in General
These assertions concern security and privacy considerations that could not be categorized under any of the other headings.
When generating TDs for an IoT ecosystem not covered by the WoT Binding Templates, TD creators SHOULD ensure that all the security requirements of the IoT Platform are satisfied.
Developer Instructions:
Here “ecosystem” generally refers to devices satisfying other standards or part of a proprietary set of devices following de-facto standards.
For example, suppose a Thing Description is actually describing an OCF or ECHONET device. Those standards have their own requirements for security that TDs should not violate.
If such a device is being described directly then the Thing Description should be accurate but it should also satisfy any requirements of the relevant ecosystem standard. For example, if an ecosystem standard says that authentication MUST be negotiated at connection time, then an “auto” security scheme should be used rather than describing the authentication requirements in the TD.
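For example, a TD for such a device can delegate authentication entirely to the ecosystem's connection-time negotiation via the `auto` security scheme. The `title` and security definition name below are placeholders; a complete TD would also include interaction affordances and forms.

```json
{
  "@context": "https://www.w3.org/2022/wot/td/v1.1",
  "title": "EcosystemDevice",
  "securityDefinitions": {
    "auto_sc": { "scheme": "auto" }
  },
  "security": ["auto_sc"]
}
```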
Need 0 more implementations
TDs that can be associated with a person SHOULD generally be treated as if they contained PII and subject to the same management policies as other PII, even if they do not explicitly contain it.
Developer Instructions:
PII is “personally identifiable information”. In IT systems, the data associated with people is generally handled with more care due to privacy concerns. You should treat TDs that describe devices potentially associated with people the same way. This is true even if the connection is not explicit or there is no explicit PII in the TD itself. For instance, the set of all TDs in a Smart Home is associated with the residents of that home and can be used to infer information about them, such as health conditions, whether they have children, income, and so on, just based on the number and type of devices.
There are some broad policies around handling of PII which depend on the jurisdiction, but three relevant ones are: (1) “right to be forgotten” (e.g. ability to delete a stored TD), (2) limited retention timelines (e.g. expiry dates upon which a TD is automatically deleted), and (3) controlled access (e.g. not made available publicly), satisfaction of which we can take as satisfaction of this assertion. Note that this applies to TDs, e.g. Thing instances, not TMs, Thing Models. All three policies are satisfied by TDD implementations that satisfy the WoT Discovery specification.
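A minimal sketch of a TD store honouring the three policies above; the retention period, names, and the boolean authorization flag are illustrative assumptions, and a real implementation would be a TDD per the WoT Discovery specification.

```python
import time

RETENTION_SECONDS = 30 * 24 * 3600  # 30-day retention (illustrative)

class TdStore:
    """Stores TDs as if they were PII: deletable, expiring, access-controlled."""

    def __init__(self) -> None:
        self._tds = {}  # td_id -> (td, stored_at)

    def put(self, td_id: str, td: dict) -> None:
        self._tds[td_id] = (td, time.time())

    def delete(self, td_id: str) -> None:
        # (1) "right to be forgotten": delete on request
        self._tds.pop(td_id, None)

    def get(self, td_id: str, authorized: bool):
        # (3) controlled access: no anonymous reads
        if not authorized:
            raise PermissionError("authentication required")
        entry = self._tds.get(td_id)
        if entry is None:
            return None
        td, stored_at = entry
        # (2) limited retention: expired TDs are purged on access
        if time.time() - stored_at > RETENTION_SECONDS:
            del self._tds[td_id]
            return None
        return td
```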
Need 0 more implementations