Data-Ethical Consultation about Smart Clothing by Pauline van Dongen
1. The organization
The Data-Ethical Consultation concerns Smart Clothing by Pauline van Dongen, a start-up dedicated to creating smart clothing. Their garments integrate different technologies, each aimed at supporting a particular aspect of daily life. Examples include a jacket designed to support the lives of the elderly[1] and a smart belt with an integrated airbag that cushions the impact when someone falls, thereby preventing hip fractures[2]. Although the company has many projects, this DEC focuses on those related to health problems. For more information, see https://www.paulinevandongen.nl/
2. The AI technologies employed
The technology utilized in these projects varies depending on the specific project. In general, a wide range of sensors is employed to gather data from both individuals and their surrounding environment[1:1],[2:1],[3]. These sensors serve different purposes, such as monitoring the movements of the upper body or measuring haptic signals around the neck, shoulders, and back.
Once the data has been collected, software and Artificial Intelligence (AI) tools analyze it in real time, employing algorithms to extract patterns and draw conclusions. This integration of AI allows the systems to make decisions on the spot based on the analyzed data, an aspect that is especially valuable in applications such as the fall-detection belt.
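Because the company does not disclose its actual sensors or algorithms (as noted below), the following is only a minimal sketch of what a real-time fall-detection pipeline of this kind could look like; the sampling model and both thresholds are assumptions made for illustration.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch only: the company does not disclose its actual
# sensors or algorithms, so the sampling model and thresholds below
# are assumptions for illustration.

@dataclass
class AccelSample:
    x: float  # acceleration along each axis, in g
    y: float
    z: float

FREE_FALL_G = 0.4  # assumed: magnitude well below 1 g suggests free fall
IMPACT_G = 2.5     # assumed: a sharp spike afterwards suggests impact

def magnitude(s: AccelSample) -> float:
    """Total acceleration magnitude in g."""
    return math.sqrt(s.x ** 2 + s.y ** 2 + s.z ** 2)

def detect_fall(window: list[AccelSample]) -> bool:
    """Flag a fall when a free-fall phase is followed by an impact spike."""
    saw_free_fall = False
    for sample in window:
        m = magnitude(sample)
        if m < FREE_FALL_G:
            saw_free_fall = True
        elif saw_free_fall and m > IMPACT_G:
            return True
    return False
```

Real products presumably combine several sensors and more sophisticated models; the sketch only illustrates the general pattern of streaming sensor data through a decision rule.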
In addition, many of these wearable technologies come with an app[2] that allows the user to view the collected data. By presenting the information in an application, users can conveniently review the data themselves, helping them understand their own body and environment and take an active role in their well-being.
Unfortunately, the company is not very clear about which sensors or software it uses; it only indicates that sensors collect data, which is then analyzed and made available to the user.
3. Ethical concerns
The company is aware that it often deals with sensitive data and must therefore be careful with its management. In addition, on its privacy page[1] it states that it never automatically makes decisions that have consequences for a person's life. Nevertheless, several points can be identified that may pose ethical risks: users' access to the data without the supervision of a doctor or other expert, what counts as an automatic decision with no consequences, and how sensitive data is managed. These three issues are discussed below.
Analysis of the data
The first issue to be discussed concerns the management of data access. Let's take the elderly person's jacket, which enables the monitoring of numerous health factors, as an example. The company says that the physiotherapist will have access to the data[1:2], but it is difficult to guarantee that such access will be available in all circumstances and throughout the individual's lifetime. Even if the jacket is initially worn as part of a rehabilitation program, the person may eventually find themselves without a physiotherapist. This raises questions about the possible repercussions of unmonitored access to personal health data.
Without professional oversight, people may be left to interpret the data on their own, increasing the likelihood of "self-medication" or of decisions based on misread data. Without the assistance of a qualified physiotherapist, they could misinterpret the readings or overestimate their own state of health, which could exacerbate existing problems or leave crucial health issues undetected.
Additionally, unrestricted access to health information may harm a person's mental health. An incorrect reading of the data can lead them to assume that their health is poor, causing unneeded tension and anxiety. This stress can reduce their quality of life, further worsen their overall mental health, and possibly start a vicious cycle of worry.
In any case, it is critical to recognize that medical data should always be handled with extreme caution and under the guidance of experts. Expert assistance and supervision are necessary to ensure correct interpretation, sound decision-making, and people's general well-being when using wearable technology or any other type of health monitoring.
The issue of unmonitored access to personal health information is thus crucial, especially considering the needs of elderly people. Without adequate professional supervision, there is a risk of misinterpretation and of adverse effects on health.
Automated decisions
The second issue to discuss is what constitutes an automatic decision with no consequences. The company offers very little information on this point on its privacy page[1], delivering only a general statement. This lack of transparency raises concerns and leaves considerable room for interpretation.
To better grasp the potential effects, let's examine a specific case: the airbag belt intended to prevent hip fractures when an elderly person falls. Although this is unquestionably a valuable safety feature, there is a chance that the system will malfunction and inflate the airbag even in the absence of a fall. In such circumstances, the sudden inflation could frighten the person, causing them to lose their balance and possibly leading to injuries or other negative effects. So even though this is an automatic decision made in real time, its effect on the person's life, both physical and psychological, is undeniable.
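To make the malfunction scenario concrete, reconsider the hypothetical threshold detector sketched in Section 2 (an assumption, not the company's actual logic): a motion that merely resembles a fall, such as dropping heavily into a chair, can satisfy both thresholds and fire the airbag.

```python
# Reusing the hypothetical detect_fall() and AccelSample from the
# sketch in Section 2: dropping heavily into a chair can briefly look
# like free fall followed by an impact, producing a false positive.
sit_down = [
    AccelSample(0.1, 0.2, 0.3),  # brief low-g phase while dropping onto the seat
    AccelSample(0.5, 1.0, 2.6),  # jolt on contact with the cushion
]

print(detect_fall(sit_down))  # True -- the airbag would inflate without a real fall
```

However the real system is implemented, any decision rule has some false-positive rate, which is precisely why "automatic decisions with no consequences" is a claim that deserves scrutiny.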
Furthermore, the company's products include a T-shirt equipped with stress-level detection, which triggers guided breathing exercises based on the detected stress. However, stress has many causes, and in some circumstances it may be important for a person to maintain a certain degree of stress. A heightened level of alertness can be necessary, for example, during a major presentation, a crucial meeting, or a dangerous situation while driving. By starting stress-reduction exercises automatically, the T-shirt may unintentionally distract the wearer and limit their ability to perform at their best. This distraction may thus affect the individual's performance and their capacity to make decisions.
All these actions undoubtedly have consequences, especially considering that the target audience is predominantly elderly people. Because they are a vulnerable group, the potential effect on their lives is considerably more severe. The company must therefore carefully examine and explain the effects of automatic decisions that appear to have no consequences. When targeting vulnerable populations like the elderly, transparency, thorough information, and an understanding of potential effects are especially important for ensuring people's safety and well-being.
Data privacy
The third issue concerns data management. It involves medical data, which falls into the special category of sensitive personal data[5] and must therefore be handled with particular care and in compliance with privacy regulations. Responsible management of this data is essential to safeguard individuals' privacy and security. As for the company's transparency, some doubts arise. First of all, the main website does not provide direct access to any section on privacy policies; a separate search was necessary to locate this information. In addition, the page only covers the data provided when creating an account, making a payment, or communicating with the company[1]. There is no mention of what happens to the data collected by the devices: how it is managed, how it is stored, or who has access to it.
Regarding data deletion, the company indicates that it keeps the data as long as "strictly necessary", which is a very vague statement. Since the privacy page only mentions data related to the account, the website, and so on, it is reasonable to believe that this deletion policy refers only to that data. Users may wonder whether the data collected by the wearable devices is retained indefinitely or whether there is a mechanism to delete it at some point. One might suspect that the data is used to improve the systems and is therefore never deleted, but one can only speculate, since no information is provided. This lack of transparency raises many questions and concerns: users have a right to know how their data is used, stored, and safeguarded.
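For illustration, a deletion mechanism of the kind users might expect could be as simple as the sketch below. Nothing on the privacy page confirms that any such mechanism exists for device data, and the 90-day window is purely an assumption.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention rule -- the company's actual policy for
# device data is not published; the 90-day window is an assumption.
RETENTION = timedelta(days=90)

def purge_expired(records: list[dict]) -> list[dict]:
    """Keep only sensor records younger than the retention window."""
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [r for r in records if r["recorded_at"] >= cutoff]
```

Publishing even a simple rule like this, together with a way to request deletion, would remove much of the speculation.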
4. Recommendations
Taking into account the ethical issues raised in this DEC, some measures could be taken to address them. Regarding unsupervised access to data, one option would be to collaborate with healthcare providers who can monitor users remotely; users could then also ask them questions or point out any readings that concern them. Although current sales figures are not available, as a start-up it is reasonable to assume that the number of users is low enough for such monitoring to be feasible; this would of course have to be revisited if the company continues to grow. Another option is to restrict the wearables to medical centers, rehabilitation centers, and similar settings. That way, new data would always be supervised by the corresponding professional, and the user could still access the data after the session, once it had been analyzed.
The remaining recommendations concern the company's transparency with users. Any action can have consequences in people's lives, and automatic decisions even more so. Therefore, even if the company states that its automatic decisions have none, it should provide clear explanations of the decision-making processes and their potential consequences, even when these seem to have no immediate impact. This transparency would enable users, especially vulnerable populations like the elderly, to fully understand how these decisions can affect their lives.
Finally, focusing on transparency in data management, the first and easiest recommendation is to provide a direct link to the privacy policy on the main website, so that users and potential users can quickly find out how their data will be managed. The company should also explain how the data collected by the wearables is managed: how it is stored, who has access to it, and whether it is used for any other purpose. It would also be advisable to inform users about the decision-making process, what the deletion policy is, and how deletion can be requested.
All these recommendations could help the company gain the trust of users, win new customers and continue to grow and develop more projects.
5. References
Pauline van Dongen, "Vigour - Pauline van Dongen," Pauline Van Dongen, Oct. 12, 2020. https://www.paulinevandongen.nl/portfolio/vigour/ ↩︎ ↩︎ ↩︎
Pauline van Dongen, "Wolk Hip Airbag - Pauline van Dongen," Pauline Van Dongen, Dec. 23, 2020. https://www.paulinevandongen.nl/portfolio/wolk-hip-airbag/ ↩︎ ↩︎
Pauline van Dongen, "Mysa - Pauline van Dongen," Pauline Van Dongen, May 25, 2022. https://www.paulinevandongen.nl/portfolio/mysa/ ↩︎