Hidden Markov models offer a flexible framework for modeling data with complex dependencies in many fields, such as signal processing, genomics, epidemiology, and ecology. In the context of partially observed time series, observations are assumed to reflect a hidden Markovian dynamic. We consider the case where both the conditional law of the observations given this hidden dynamic and the transition law of the Markov chain belong to a parametric family. The inference objectives are then to 1) predict the distribution of the hidden states given the observations, and 2) estimate the unknown parameters. Numerous algorithms exist to accomplish these tasks, notably in the maximum likelihood framework. In this presentation, I will discuss recent work on the development of variational inference algorithms within this framework, along with theoretical results in this area. I will also present a new algorithm linking Monte Carlo methods and optimization methods used in deep learning.
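As a concrete illustration of the first inference objective (predicting the distribution of the hidden states given the observations), the sketch below implements the classical forward filter for a toy two-state HMM with Gaussian emissions. The model, its parameter values, and all names are illustrative assumptions, not taken from the talk.

```python
# Minimal sketch: filtering p(x_t | y_{1:t}) in a discrete HMM.
# All parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Parametric model: transition matrix A and Gaussian emission laws.
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])      # A[i, j] = P(x_t = j | x_{t-1} = i)
means = np.array([0.0, 3.0])    # emission mean for each hidden state
sigma = 1.0                      # common emission standard deviation

# Simulate a short partially observed time series.
T = 100
x = np.zeros(T, dtype=int)
for t in range(1, T):
    x[t] = rng.choice(2, p=A[x[t - 1]])
y = rng.normal(means[x], sigma)

def forward_filter(y, A, means, sigma):
    """Return the filtering distributions p(x_t | y_{1:t}) for each t."""
    T, K = len(y), len(means)
    alpha = np.zeros((T, K))
    prior = np.full(K, 1.0 / K)
    for t in range(T):
        # Gaussian emission likelihood of y[t] under each hidden state.
        lik = np.exp(-0.5 * ((y[t] - means) / sigma) ** 2)
        # One-step prediction through the Markov transition.
        pred = prior if t == 0 else alpha[t - 1] @ A
        alpha[t] = pred * lik
        alpha[t] /= alpha[t].sum()  # normalize to a probability vector
    return alpha

filt = forward_filter(y, A, means, sigma)
print(filt.shape)  # one probability vector over the 2 states per time step
```

Exact filtering like this is tractable only for small discrete state spaces; variational and Monte Carlo methods of the kind discussed in the talk address the general case.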