Ethical talks #2 : Employability and free software

🕒 2 years ago

Hello!

We held a debate with Carl Chenet on employability in the field of free software and its ethical implications, following his talk on 19 November at school 42.

The debate was restricted to a small circle and brought together 13 people around this theme.

We initially wanted to steer the debate toward specific points, but the participants proved able to lead it themselves, without our intervention.


Brume took care of taking notes and writing the report.

Pohl translated the document into English.


Here are some useful links for the topic addressed:


Introduction by Carl Chenet

Generally speaking, the purpose of a company is to earn money. Everything is concrete and pressing; when problems arise, quick fixes get stacked on quick fixes... The notion of ethics is therefore poorly understood, and there is a certain distance between companies and ethics. So, to choose the right company to work for, you have to gauge your personal threshold of ethical tolerance against that company.

Student reaction: In my opinion, a company has to make money, to be lucrative. Not everything that is ethical pays off; it is often quite the opposite, so companies are not interested in ethics.

Carl's reaction: With the popularization of social networks and of influence on the Internet, the image we project is extremely important. Every organization therefore tries to be very careful about its image in order to appeal to users. Since ethics is generally well regarded, companies often tend toward an ethical image, and therefore toward ethical actions. Social networks could thus be a tool to promote ethics.

The word "ethics" is used a lot, but what is its definition? We could take the example of an energy company. It would be ethical if it showed ecological coherence, a certain respect for well-being, and a desire not to damage the environment. Ethics is profitable in the long term but not in the short term, which does not match the vision of a company that stacks temporary actions on temporary actions.

A student sees wage labour as a way to earn money and nothing more, a view they admit is extremely pessimistic.

Another student says they are interested in companies with a foothold in libre software and culture, such as Nextcloud, because these kinds of companies look ethical, which would be appropriate for them.

Proposed alternative definition of ethics: a set of moral and social values. In some companies, there is a real ethical purpose. Unfortunately, there are often "capitalist abuses". This is the case with the GAFAM, for whom personal data is of real profit interest beyond its primary use. A company like Nextcloud, to take the example mentioned above, does not have such a lucrative interest, so it has no direct incentive to sell personal data.

Carl gives another example: the case of Google. In this company, employees are treated extremely well. So could we consider it ethical? Yet these employees rebelled [Dragonfly project]. This suggests that the definition of ethics is subjective.

The case of Cambridge Analytica was mentioned: a private company that used public Facebook data to build statistics on the political opinions of American citizens, then used them, through targeted advertising to Facebook users, to influence the vote in presidential elections and reduce the rates of blank ballots and abstention.

Carl tells us he was a sysadmin in a company that hosts blogs. He received a letter rogatory (a legal act whereby a judge requests information, with no possibility of refusal). He had to hand over personal information on users, more precisely to report and certify the content of blogs. This went against his ethical values, and Carl did not want to cooperate with these judges. However, he was told he had to obey his company. So he decided to say no, and quit his job.

This shows that there can be a real confrontation between legal and ethical issues.

Many companies have an ethics charter that is supposed to represent the company's vision of ethics. This can give you a first impression of the company you are contracting with. In Carl's case, the request seemed horrendous and not in line with this ethics charter.

A student asks if such a request is legal. Carl replies that it is. Many students are surprised. Others explain to them the principle of the "Fiche S" (the French registry flagging people suspected of, for example, terrorism): users can be closely monitored as soon as they install Tor or Linux, or post suspicious comments on social networks. As with subpoenas in the United States, refusing this kind of request for personal information is not allowed in France.

The debate centered back on a question: how can we prevent a company's pursuit of profit from getting in the way of the free software development process?

It is firstly a question of personal ethics: a person or community developing free software will generally have a certain ethic that tempers the pursuit of profit (otherwise, the software would not be free).

It is also possible to have a desire to change, to impact society. It is then possible to join a company in order to slightly impact your colleagues, your environment, the world around you...

A student asks whether this would be valid only for small businesses. The answer is no: all large companies use free software. There are many reasons for this: maintenance, price, efficiency... but not ethics: developers sometimes use free software to create proprietary products.

Example of the "Freebox" (a French Internet provider's router), which was labelled "for rent": Free used the Linux kernel with modifications that were not made publicly available, and leasing the boxes rather than selling them was the way to argue that the software was not being "distributed" under the terms of the license.

There are other cases where developers take free software, change the logo and the name, and turn it into a proprietary, expensive product. This shows that licences are not always respected.

This partly explains the recent success of licenses such as the MIT license: they allow free code to be "closed" into proprietary products, which the GPL does not permit. There are other alternative licenses, considered non-free, such as the MPL, described as a "non-violent" license: it forbids violent uses of the code. For example, it is prohibited to use code under such a license in a program designed to kill.
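As a rough illustration (a hypothetical source file, using the standard SPDX header convention), the practical difference between a permissive and a copyleft license often comes down to a single header line, but the obligations it triggers differ sharply:

```c
/* SPDX-License-Identifier: MIT
 * Permissive: this file may be copied into a closed, proprietary
 * product; essentially only the copyright notice must be kept. */

/* SPDX-License-Identifier: GPL-3.0-or-later
 * Copyleft: anyone distributing a binary built from this file must
 * make the corresponding source available under the same license. */
```

This is why a company wanting to "close" code will gravitate toward MIT-licensed components, while projects that want modifications to stay free choose the GPL.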

A question then arises: if a company pays people to work full-time on the Linux kernel, and therefore on free software, but with the objective of deploying it on drones that will kill people, is that ethical or not?

Unfortunately, this question does not really have an answer.

Developers do not "just" write programs. Society does not realize the importance of digital technology. This is a huge responsibility on the backs of developers. Each developer can be seen as one of the cogs of a huge machine: each cog taken individually is harmless, or at least believes it is, when in reality, the whole machine can be extremely harmful. This is why there is a great need for awareness and ethics in the computer industry.

A student asks: but what if a developer doesn't know what they are doing? What if it's ignorance? Are they still at fault?

No matter the situation, the "fault" is there: the outcome is virtually the same whether the person is ignorant or malicious. On the other hand, consider the example of the screwdriver: its inventor is not at fault, or even responsible, when someone else designs a bomb with it.

Future developers must therefore be made aware of these issues. It is necessary to be conscious of our actions and of what we are capable of doing, and to think about the consequences of those actions and about what our code could be used for.

A student warns against pursuing technical feats without regard for what will be done with them: they give the example of a friend who was delighted to have succeeded in creating facial recognition software for a large company, simply because they had taken great pleasure in building it.

Society dehumanizes people, infantilizes them. They are not invited to think about what their code will be used for. They are not included at all in the use of their work.

Carl agrees: it is important to measure the impact of your actions, to ask yourself if it is still ethical. Sometimes developers are not even informed of the finished product they are developing for.

Example of Netflix: a great platform for users. It's practical and simple, and there is plenty of choice... Behind it, however, lies a significant ecological impact: the quantity of stored videos, most of which will only ever be viewed by a minority of people, requires a huge number of servers, which leave a significant ecological footprint.

We can rarely predict the future use of what we are building today, so it is important to stay the course. As an employee you are certainly in a relationship of subordination, but the developer nevertheless carries a lot of responsibility.

A student raises another point: lithium, a metal crucial to batteries and electronics, is becoming a scarce resource, to the point where wars are fought to obtain it. One can therefore ask whether designing an energy-intensive program, which will require larger machines, is a lack of ethics, since it indirectly "causes" wars. Writing optimized programs could now be a requirement for being considered ethical. Ethics can therefore evolve. To what extent can something be considered an ethical need?

Carl's summary: Ethics is extremely variable and different for everyone. It is everyone's duty, as a human being, to consider the consequences of our actions.

English translation by Pohl.