TL Tech Ltd (TL Tech) is a smart home solution provider, designing, installing and maintaining a range of smart home technology for homeowners, house builders, housing associations and councils. We also create voice application software for devices such as Amazon Alexa smart speakers.
This statement has been prepared to document our approach to digital and data ethics and sets out our guiding principles of conduct. Digital ethics is not prescriptive or black-and-white, and therefore every situation or decision should be assessed using these principles as a guide.
Digital ethics focuses on respecting a person’s data: it is about ensuring that the person and their needs always stay at the centre of how data is stored, used and shared. It is concerned not so much with what an organisation is permitted to do in the digital domain, as governed by rules and regulations such as the General Data Protection Regulation (GDPR), but with what it ought to do.
Our commitment is to embed a digital ethics strategy across our organisational structure and processes, ensuring that business success and revenue generation are never allowed to take priority over ethical principles. Technology is constantly evolving, and the balance between ethical and economic value can create tension. We feel strongly that being transparent in our approach helps foster better outcomes for the people we support and better working relationships.
Ethics by Design
Applying person-centred design promotes accessibility and a positive user experience. It also allows key stakeholders to contribute to the design of data practices and ensures that ethics are considered appropriately. This approach uses risk identification practices to prevent risks and mitigate their impact rather than simply reacting to risks as they emerge. If a product or service can be provided without the need to collect data, that should be the default design, as it is always inherently lower risk.
Data practices should work to the benefit, not the detriment, of people, both individuals and wider society. The benefits that a customer or society experiences should outweigh, or at least be proportionate to, the potential risks associated with sharing data.
We partner with academia, scientific organisations, governmental organisations, health and social care organisations, peer companies, charities and patient groups to access the best science/engineering, stimulate innovation and accelerate adoption of digital solutions that transform people’s lives.
We endeavour to perform due diligence on the digital and data practices and ethics of our supply chain and partners where applicable. We will never work with partners whose ethical values do not align with ours, or who may exploit data to an individual’s detriment.
We will always be clear about when technology is used in place of human interaction, so that users do not feel confused or deceived when accessing our products and services. Users should always know whether they are interacting with a human or a machine.
We will provide transparency around how and why our digital products and services generate the outputs they do. This is particularly relevant for Artificial Intelligence (AI) applications, for which the assumptions, workings and outputs should be traceable and explainable.
We will endeavour to ensure our products and services are free from discrimination, show social and cultural respect, and are consistent with the public interest, including human rights and democratic values. We actively work to avoid AI prejudice and bias by using diverse and representative datasets and by promoting diversity within our development team.
Continuity of Service
We will seek to minimise the risk of outages affecting services which users depend on and will communicate in advance where there may be potential breaks in continuity of service. Where a service can no longer be sustained, we will seek to support users to migrate to an alternative product or service. In this scenario, users will be consulted as far in advance as is practicable in order to gain their input and views on the best way to manage any transition.
Third Party Data
We do not use third party data to create customer profiles or to target content preferences.
Open Research & Collaboration
We commit to making research freely open and accessible for reuse, and to collaborating with partners where there is wider societal benefit. Any data used in this way would be completely anonymised, and datasets would be large enough to prevent users being re-identified through the characteristics of the data.
Responsible Advertising
We are committed to advertising responsibly and accurately, avoiding false and deceptive statements. This includes in-product advertising on behalf of partner organisations, which we thoroughly research, and we only establish partnerships with organisations whose values align with ours.
We will apply effective governance and oversight mechanisms for all our product and service delivery. We operate with accountability and accept feedback and complaints. We will only use data in a manner which respects privacy, transparency and accountability to the individual.
If you have any questions about this digital ethics statement (or our data privacy practices which you can access by clicking here), please contact us by telephoning us at 07825 586 731 or by writing to us at email@example.com or Wardhead, Sunnyside Of Folla, Rothienorman, Aberdeenshire, Scotland, AB51 8UL.
We keep our digital ethics statement under regular review. This version was last updated on 06/02/22.