CV


Basics

Name Anass Belcaid
Label Scientist
Email a.belcaid@uae.ac.ma
Phone +212-6 19 32 50 99
Url https://anassBelcaid.github.io
Summary Artificial Intelligence Associate Professor. Competitive programming enthusiast. Chess lover.

Work

  • 2024.05 - 2024.07
    Associate Professor
    National School of Applied Sciences-Tetouan
    Teaching artificial intelligence courses to Engineering students
    • ML
    • AI
    • Maths
  • 2022.12 - 2024.05
    Associate Professor
    National School of Applied Sciences-Safi
    Teaching artificial intelligence courses to Engineering students
    • ML
    • AI
    • Maths
  • 2019.01 - 2022.12
    Associate Professor
    School of Artificial Intelligence Euromed
    Teaching artificial intelligence courses to Engineering students
    • ML
    • AI
    • Maths
  • 2009.01 - 2019.12
    Adjunct Professor
    National School of Arts and Crafts
    Teaching Numerical Analysis methods
    • Scientific computing

Education

  • 2015.01 - 2018.01
    PhD
    National School of Arts and Crafts, Meknes
    Applied Mathematics
    • Optimization
    • MRF
    • Vision

Certificates

Bayesian Methods for Machine Learning
Higher School of Economics 2019-03-20
Introduction to Deep Learning
Higher School of Economics 2019-03-04
Statistical Learning
Stanford University 2018-09-04
Algorithms: Design and Analysis 2
Stanford University 2014-08-01
Algorithms: Design and Analysis
Stanford University 2014-07-01
Digital Signal Processing
Coursera 2014-07-01

Publications

Skills

Artificial Intelligence
Machine Learning
Vision
Deep learning
Time Series segmentation
Hierarchical Learning
Maximum Entropy learning

Languages

Arabic
Native speaker
French
Fluent
English
Fluent

Projects

  • 2024.01 - 2027.01
    Maximum Entropy learning
    Entropy is an old concept in physics, where it can be defined as a measure of the chaos or disorder in a system[1]: higher entropy means greater disorder. The notion is slightly different in information theory, where the mathematician Claude Shannon introduced it in 1948. There, entropy is the expected number of bits of information contained in an event. For instance, tossing a fair coin has an entropy of 1, because heads and tails each have probability 0.5, and identifying the outcome takes exactly one yes-or-no question: "is it heads, or is it tails?". The higher the entropy, the more information we need to represent an event, so entropy increases with uncertainty. For example, crossing the street requires fewer bits of information to represent, store, or communicate than a game of poker.
    • Reinforcement learning
    • Maximum Entropy
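The coin-toss example above can be checked numerically. A minimal sketch of Shannon entropy in Python (the function name is illustrative, not taken from any project code):

```python
import math

def shannon_entropy(probs):
    """Expected number of bits needed to encode one outcome
    drawn from the given probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: two outcomes, each with probability 0.5.
fair_coin = shannon_entropy([0.5, 0.5])

# A biased coin is more predictable, so its entropy is lower.
biased_coin = shannon_entropy([0.9, 0.1])

print(fair_coin)               # 1.0 bit, as in the example above
print(round(biased_coin, 3))   # 0.469 bits
```

As the text describes, the fair coin needs exactly one yes-or-no question (1 bit), while the biased coin, being less uncertain, needs less than one bit on average.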