
Final-year internship in AI optimization, toward a Master 2 or an engineering degree (M/F)

  • Internship
  • Palaiseau (Essonne)
  • IT development

Job description

Vacancy details

General information

Organisation

The French Alternative Energies and Atomic Energy Commission (CEA) is a key player in research, development and innovation in four main areas:
• defence and security,
• nuclear energy (fission and fusion),
• technological research for industry,
• fundamental research in the physical sciences and life sciences.

Drawing on its widely acknowledged expertise and its 16,000 technicians, engineers, researchers and staff, the CEA actively participates in collaborative projects with a large number of academic and industrial partners.

The CEA is established in ten centers spread throughout France.

Reference

2024-34106

Unit description

The French Alternative Energies and Atomic Energy Commission (CEA) is a key player in research, development, and innovation. This technological research organization is active in three major fields: energy, information and health technologies, and defense. As a recognized expert in its fields, the CEA is fully integrated into the European research area and continues to expand its international presence. The Laboratory for Systems and Technology Integration (LIST), located at Saclay in the south of the Île-de-France region, is tasked with contributing to technology transfer and promoting innovation in the field of embedded systems. Within LIST, the Embedded Artificial Intelligence Laboratory (LIAE) is responsible for designing, developing, and implementing solutions optimized for embedded systems in terms of area, power consumption, and computing power.

Position description

Category

Engineering science

Contract

Internship

Job title

Final-year internship in AI optimization, toward a Master 2 or an engineering degree (M/F)

Subject

Combining Token Pruning and Mixed Precision: A Dual Strategy for Efficient Vision Transformers

Contract duration (months)

6

Job description

This internship proposes to explore a dual approach to optimizing ViTs by combining two complementary techniques: Token Pruning and Mixed Precision. Token pruning reduces the amount of information processed at each layer by dynamically removing redundant or irrelevant tokens, thereby alleviating the computational load without significantly compromising performance. In parallel, mixed precision relies on lower-precision numerical formats (e.g., moving from 32-bit down to 16-bit or 8-bit) to reduce memory usage and accelerate computation while retaining sufficient accuracy for vision tasks.
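
As a rough illustration of how the two techniques compose (in PyTorch, the framework listed under Methods / Means), the sketch below keeps only the top-k tokens ranked by a simple norm-based importance score and runs a toy encoder layer under automatic mixed precision. The scoring rule, the `keep_ratio` value, and the layer dimensions are illustrative assumptions, not the laboratory's actual method.

```python
import torch
import torch.nn as nn

def prune_tokens(tokens: torch.Tensor, keep_ratio: float = 0.5) -> torch.Tensor:
    """Illustrative token pruning: keep the top-k tokens per image, ranked
    here by L2 norm as a stand-in importance score (a real criterion would
    typically be learned or attention-based). tokens: (batch, num_tokens, dim)."""
    k = max(1, int(tokens.shape[1] * keep_ratio))
    scores = tokens.norm(dim=-1)                               # (batch, num_tokens)
    idx = scores.topk(k, dim=1).indices                        # indices of kept tokens
    idx = idx.unsqueeze(-1).expand(-1, -1, tokens.shape[-1])
    return tokens.gather(1, idx)                               # (batch, k, dim)

device = "cuda" if torch.cuda.is_available() else "cpu"
amp_dtype = torch.float16 if device == "cuda" else torch.bfloat16

# Toy encoder block standing in for one ViT layer.
block = nn.TransformerEncoderLayer(d_model=192, nhead=3, batch_first=True).to(device)
tokens = torch.randn(8, 196, 192, device=device)               # e.g. 14x14 patch tokens

with torch.autocast(device_type=device, dtype=amp_dtype):
    kept = prune_tokens(tokens, keep_ratio=0.5)                # fewer tokens -> fewer MACs
    out = block(kept)                                          # matmuls run in reduced precision
```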


The goal of this internship is to design, implement, and evaluate the effectiveness of this dual approach within a vision transformer model in order to achieve an optimal balance between computational efficiency and predictive performance. The laboratory, which has experience with quantized ViT models, has already developed a token reduction approach that has shown promising results on semantic segmentation tasks. State-of-the-art solutions will be adapted at different levels: at the encoder level, by integrating mixed-precision quantization of the operators, and at the decoder level, by adapting the model head to the quantized encoder to ensure consistent information processing. Finally, benchmarking tests (FPS, mIoU, Params, MACs, FLOPs) will be conducted on an embedded NVIDIA Jetson Orin board to evaluate the generalization capabilities of the token reduction model.
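
A minimal benchmarking loop along these lines could look as follows; the input resolution, warm-up and iteration counts are placeholder values, and MACs/FLOPs figures would normally come from a dedicated profiler (and mIoU from evaluation on the target dataset) rather than the raw parameter count reported here.

```python
import time
import torch

@torch.no_grad()
def benchmark(model: torch.nn.Module, input_shape=(1, 3, 512, 512),
              warmup: int = 10, iters: int = 50) -> dict:
    """Rough throughput and size measurement for a segmentation model."""
    device = next(model.parameters()).device
    x = torch.randn(*input_shape, device=device)
    model.eval()

    for _ in range(warmup):                        # warm-up to stabilize clocks/caches
        model(x)
    if device.type == "cuda":
        torch.cuda.synchronize()

    start = time.perf_counter()
    for _ in range(iters):
        model(x)
    if device.type == "cuda":
        torch.cuda.synchronize()
    elapsed = time.perf_counter() - start

    return {
        "fps": iters / elapsed,
        "params_millions": sum(p.numel() for p in model.parameters()) / 1e6,
    }
```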


In this context, the objectives of the internship are:

  • A survey of token reduction techniques;
  • A survey of mixed-precision quantization techniques;
  • Benchmarking tests (FPS, mIoU, Params, MACs, FLOPs) of models with the selected optimization techniques;
  • Development of a new frugal approach that challenges the state of the art (SoTA);
  • Implementation on an NVIDIA Jetson Orin embedded platform (a minimal export sketch follows this list).
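
For the embedded target, one possible path (an assumption here, not a prescribed toolchain) is to export the optimized model to ONNX and then build a TensorRT engine on the Jetson Orin, for instance with the trtexec tool:

```python
import torch

def export_for_orin(model: torch.nn.Module, onnx_path: str = "vit_seg.onnx",
                    input_shape=(1, 3, 512, 512)) -> None:
    """Export a trained model to ONNX; the file can then be converted to a
    TensorRT engine on the Jetson Orin. Resolution and opset are illustrative."""
    model.eval()
    dummy = torch.randn(*input_shape)
    torch.onnx.export(
        model, dummy, onnx_path,
        input_names=["image"], output_names=["logits"],
        opset_version=17,
        dynamic_axes={"image": {0: "batch"}, "logits": {0: "batch"}},
    )
```

Note that dynamic token pruning produces variable-length token sequences, which may require fixing the kept-token count (or exporting one graph per keep ratio) before conversion.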

#Token #TokenPruning
#MixedPrecision
#VIT #VisionTransformers #EfficientVisionTransformers
#ModelOptimization
#DeepLearning
#NeuralNetworks
#AIOptimization
#MachineLearning
#ModelCompression
#ReducedComplexity
#EnhancedPerformance

Methods / Means

PyTorch, Git

Applicant Profile

Requested profile: Master's degree (BAC+5)

Position location

Site

Saclay

Job location

France, Ile-de-France, Essonne (91)

Location

Palaiseau

Candidate criteria

PhD opportunity

Yes

Requester

Position start date

03/02/2025

