LaDissertation.com - Dissertations, reading notes, BAC examples

The flaws of AI


By  •  15 June 2023  •  Dissertation  •  2,754 words (12 pages)  •  150 views


MSc SOC

HOME RETAKE EXAM

START DATE: 09/05/2023 09:00am (Paris time)

DUE DATE: 12/05/2023 11:59pm (Paris time)

Course coordinator: Prof. Mohamed-Hédi CHARKI

INSTRUCTIONS:

  • Assignment type:  Individual
  • Type(s) of assignment file accepted:  WORD
  • Number of document(s) expected:  1
  • Number of submission attempts allowed:  2
  • Appendix provided:  No
  • SafeAssign:  this assignment will be checked for plagiarism.


EXAM SUBJECT:

Machine Learning has transformed the way applications are conceived, developed, and implemented. We saw evidence of this in the last session of the IT Strategy course. Nevertheless, paradoxically, some machine learning initiatives fail.

You are asked to analyze two situations of machine learning failure.

In terms of content this means that you need to:

  • Present in detail what the failure was about
  • Explain the failure
  • Show its impact
  • Provide recommendations to avoid similar failures

You need to provide a document with:

  • 1.5 to 2 pages of analysis per failure
  • your text has to be in a 12-point typeface and single-spaced
  • the pages have to be numbered.

Best wishes

Pr Mohamed-Hédi Charki & Pr Peter Saba

  1. Tesla's Autopilot crashes
  • Present in detail what the failure was about

On July 24, 2022, a motorcyclist was killed in a collision with a Tesla operating on Autopilot in Utah. According to the National Highway Traffic Safety Administration (NHTSA) investigation, this was an identification problem: the Tesla closed in on the motorcyclist dangerously, struck the rear of the motorcycle, and threw the rider to the ground; the motorcyclist died instantly. The driver of the Tesla told investigators that he had not seen the motorcyclist. This is not the first time Autopilot has been implicated in road accidents, and such accidents are increasingly frequent: of 392 reported accidents involving semi-automated (level 2) driving, 273 involved a Tesla. Tesla nevertheless maintains that Autopilot is ten times safer than a conventional car.

  • Explain the failure

This failure was caused by a combination of factors: drivers' overreliance on the technology, the limitations of the current state of autonomous driving technology, and the difficulty of accurately modelling complex driving scenarios. The software can ignore scene-control measures such as warning lights, flares, and cones, mainly at night when light is low; the fault can also come from damaged or obstructed sensors.

Tesla drivers have access to level 2 and benefit, for example, from adaptive cruise control and lane-keeping assistance. This is what Tesla offers with its Autopilot system.

However, at level 2 it is the driver's responsibility, not Tesla's, that is engaged in the event of a dispute. At this level the car can act autonomously, but, as Tesla advises, the driver must constantly monitor the driving and be ready to take control at any moment.

  • Show its impact

Although Tesla insists that Autopilot is not a fully autonomous driving system (levels 4 and 5) and that drivers must remain attentive and ready to take control at any time, some drivers have relied too heavily on the system, leading to accidents.

The crashes have led to injuries and to the deaths of motorists and passengers, as well as damage to Tesla's reputation and to public trust in autonomous driving technology.

These accidents have a real negative impact both on investment in the development of autonomous cars and on public acceptance of the technology.

However, we have to put Tesla's Autopilot crashes into perspective, since we assess risk based on perceived danger and on the fear we feel. The risks we can control frighten us much less than those that escape us; this is why we tend to fear the plane more than the car, even though the death rate, itself very low, is almost the same. The same applies to self-driving cars.

  • Provide recommendations to avoid similar failures

Based on mathematical and statistical approaches, computers can develop the ability to learn from the phenomenal amounts of data they analyse. Machine learning covers the design, analysis, optimization, development, and implementation of such methods.

The first phase of machine learning is crucial for self-driving cars, since it consists in estimating a model from data, in this case a limited number of observations or images. This first phase is decisive for the second phase, which puts the model into production under the conditions in which it was trained; for autonomous cars, this is where one checks that the model behaves as intended and that the social norms defined upstream are respected. In 2016, MIT launched a questionnaire called the "Moral Machine" to assess the ethical choices a human would make if a self-driving car had to kill someone in an unavoidable crash. Supervised learning is strongly recommended for autonomous cars: by classifying the car's decision-making with a supervised model, the system can be made to always favor the common good of citizens.
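The two phases described above can be sketched in miniature. This is an illustrative toy, not Tesla's actual system: a 1-nearest-neighbour classifier labels objects from hypothetical feature vectors (width, height, speed); "training" memorises labelled examples (phase 1), and prediction applies the model to a new observation (phase 2). All feature values and class names are assumptions made up for the example.

```python
def train(examples):
    """Phase 1: for 1-NN, training is simply memorising the labelled data."""
    return list(examples)

def predict(model, features):
    """Phase 2: label a new observation by its closest training example."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda ex: sq_dist(ex[0], features))[1]

# Hypothetical labelled data: (width_m, height_m, speed_kmh) -> object class
data = [
    ((1.8, 1.5, 90.0), "car"),
    ((0.8, 1.4, 80.0), "motorcycle"),
    ((0.5, 1.7, 5.0),  "pedestrian"),
]
model = train(data)
print(predict(model, (0.9, 1.3, 75.0)))  # a narrow, fast object -> "motorcycle"
```

The point of the sketch is the failure mode in the Utah crash: if the training data (phase 1) under-represents motorcycles, phase 2 will misclassify them in production, which is why the quality of the first phase is decisive.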

Another option is to use a deterministic model (deterministic data is information known to be true and accurate because it is supplied by people directly or is personally identifiable, such as names or email addresses; it is often referred to as authenticated data) so that each situation the car encounters has a clearly established answer chosen by citizen consensus.
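A minimal sketch of what such a deterministic, consensus-based decision table could look like: each dilemma maps to one pre-agreed answer, so the same input always produces the same output. The scenario names, actions, and the fallback choice are hypothetical, invented for illustration.

```python
# Hypothetical table of answers fixed in advance by citizen consensus.
CONSENSUS_RULES = {
    "pedestrian_on_crossing":  "brake",
    "obstacle_avoidable_left": "swerve_left",
    "sensor_conflict":         "hand_back_control",
}

def decide(scenario):
    # Deterministic: no learning, no randomness. Unmapped situations fall
    # back to the most conservative action instead of guessing.
    return CONSENSUS_RULES.get(scenario, "hand_back_control")

print(decide("pedestrian_on_crossing"))  # brake
print(decide("unmapped_situation"))      # hand_back_control
```

Unlike a learned model, this table is auditable line by line, which is the appeal of the consensus approach; its weakness is that every situation must have been foreseen.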

Use the autonomous car only in an ideal operating environment. On country roads with poor visibility or in complicated weather conditions, the sensors of the autonomous car can misinterpret the dynamic environment in which they evolve and thus commit the irreparable. This is why certain conditions must be met before the driver can transfer his responsibility to the machine and reach level 5, so feared by autonomous car manufacturers, given that current legislation, in the event of a dispute, places the entire fault on the manufacturer for a car using level 4 or higher autonomous driving. That is why no component of the car may show a defect under any circumstances, from the sensors to the engine mechanics to the model used by the artificial intelligence: everything must be perfect before a full-scale launch.
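The "ideal operating environment" precondition above can be sketched as a simple gate that must pass before autonomous mode engages. The specific thresholds and categories (200 m visibility, highway-only, weather list) are illustrative assumptions, not any manufacturer's actual criteria.

```python
def autopilot_allowed(visibility_m, weather, road_type, sensors_ok):
    """Engage autonomous mode only when every condition of the
    (hypothetical) operating envelope holds."""
    return (
        visibility_m >= 200                 # enough range for detection
        and weather in ("clear", "cloudy")  # no rain, fog or snow
        and road_type == "highway"          # avoid poorly mapped country roads
        and sensors_ok                      # no damaged or obstructed sensors
    )

print(autopilot_allowed(300, "clear", "highway", True))    # True
print(autopilot_allowed(80, "fog", "country_road", True))  # False
```

The design choice is deliberately conservative: a single failed condition, such as an obstructed sensor, keeps responsibility with the human driver rather than transferring it to the machine.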

...
