When we discuss robot rights with people, the idea is often dismissed as unthinkable or flatly denied. This reaction stems from the way people already think about technology: there is an underlying question to which most of us carry a subconscious answer, namely, what is a technology, and does it really need rights?
Technology is, more than anything else, a tool; its core function is to fulfill its task, and it needs no rights to do so. According to Heidegger's analysis, technology is a means employed by humans for specific ends. Heidegger termed this "the instrumental definition" of technology and held that it forms a correct understanding of any kind of technological contrivance.
No matter how independently these systems behave, they will be regarded as products of human behavior and human decisions. From a regulator's perspective, it simplifies matters not to treat advanced robots as anything other than tools used by corporations or individuals; the instrumental theory therefore applies equally to simple devices and to sophisticated technologies such as AI and robots. On this view, they cannot become moral subjects in their own right, and we should treat them accordingly. For these reasons, Hall points out that "we never consider ourselves to have a moral duty to our machines," and Levy concludes that giving robots any such rights is unthinkable.
Hence, on this first approach, robots cannot have rights and therefore should not be given rights: they are mere tools, mere instruments, we are the ones who hold all the agency, and they merely serve our purposes.
The second approach is that robots can have rights and therefore should have rights. This argument is future-oriented: at some point we may have sufficiently advanced AI, which would raise many other issues such as robot responsibility and personhood, but rights would undoubtedly be a major one. As long as robots are not conscious and do not satisfy the relevant ontological criteria, there is no concern; once they achieve that capability, however, we should indeed think about extending them some level of respect and moral concern. Haraway notes that we cannot climb into other heads to learn what they think from the inside; no one knows what the world looks like from someone else's mind, and this remains a problem for robots and human beings alike.
The third argument is that robots can have rights but should not. The prominent voice in this context is Bryson, who states that "robots should be slaves" and that "robots are property." We should not give them any rights, because this position derives from the need to protect humans and social institutions: if we start granting rights to robots, the world could become chaotic, and robots would come to possess things that only human beings can possess, such as resources and assets, all of which would threaten human existence.
Even if we recognize that robots could have rights, we would refuse to grant them, because refusing benefits human existence in many ways: it keeps us in the position of masters or owners and makes our lives easier.
In essence, it may be possible to create robots capable of holding rights, but we should not do so; we should instead treat them as artifacts, as mere tools and instruments in our service, because we create them for societal benefit, and this instrumental way of thinking has long shaped human attitudes toward machines.
The last approach is that robots cannot have rights but should nevertheless be given rights. Kate Darling argues that this will hold once the relationship between humans and robots becomes complex enough that not granting them rights would harm humans. That is not the case right now, and even if it were, we would be basing the decision on our feelings, granting rights simply because we feel robots should have them.
We build robots and AI for human benefit, not to replace us. What if they do not act the way we want them to? Robots are made to serve people, not to exploit them, and granting them rights is not a win-win approach, because robots may come to overpower humans. Robots are already doing human jobs; if we also grant them rights, they may one day become intelligent enough to make humans work for them, and that is surely not what we want.
Bibliography:
- David J. Gunkel, 'The Other Question: Can and Should Robots Have Rights?' (2017) 20 Ethics and Information Technology <https://d-nb.info/1148505024/34> accessed 20 November 2021.