Principal Research Projects: Constraint-Based Local Search, Network Interdiction, Compressed Data Structures, Modern Academics, Explainable AI. Other research interests include Graph Theory, Geometry, Optimization, and Games.

All of this research is supported by open-source software and published in peer-reviewed academic papers.

Optimization is a broad topic at the crossroads of mathematical programming, artificial intelligence, bioinformatics, and other scientific fields. At its core, however, it is about modeling and solving real-world problems while balancing the efficiency, accuracy, and resource consumption of both the modeling and the solving steps.

In our work, we have designed a constraint-based local search (CBLS) framework that provides users with tailored solving strategies. Our belief that solving optimization problems should be primarily about the model, not the solver, is the core of JuliaConstraints. We provide a wide range of independent tools that explain our AI choices and guide the user in selecting relevant solving strategies.
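To make the CBLS idea concrete, here is a minimal Python sketch of the classic min-conflicts heuristic, one standard local-search strategy over a constraint model, applied to n-queens. This is purely illustrative and is not the JuliaConstraints implementation; all names are ours.

```python
import random

def conflicts(queens, col, row):
    """Count constraint violations if the queen of column `col` sits on `row`.

    `queens[c]` is the row of the queen in column c; two queens conflict
    when they share a row or a diagonal.
    """
    return sum(
        1
        for c, r in enumerate(queens)
        if c != col and (r == row or abs(r - row) == abs(c - col))
    )

def min_conflicts(n=8, max_steps=10_000, seed=0):
    """Local search: repeatedly pick a conflicted queen and move it to a
    row minimizing its conflicts (ties broken at random)."""
    rng = random.Random(seed)
    queens = [rng.randrange(n) for _ in range(n)]
    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(queens, c, queens[c]) > 0]
        if not conflicted:
            return queens  # every constraint is satisfied
        col = rng.choice(conflicted)
        scores = [conflicts(queens, col, r) for r in range(n)]
        best = min(scores)
        queens[col] = rng.choice([r for r in range(n) if scores[r] == best])
    return None  # no solution found within the step budget
```

The point of a CBLS framework is that only the model (here, the `conflicts` function) is problem-specific; the search loop is generic and can be swapped for other strategies.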

Interested users can find front packages with model examples in ConstraintModels.jl and a user-friendly JuMP-based interface in CBLS.jl.

A situation in which a network flow is attacked can be modeled as a game between two players: an attacker who wants to remove a set of links minimizing the flow value, and a defender who wants to maximize the flow value. When the attacker (resp. defender) plays first, the problem is called network interdiction (resp. network adaptive flow). Most such interdiction problems are intractable.

We provide a general framework that solves or approximates interdiction problems in polynomial time, along with bilevel mixed-integer programs that refine the approximated instances with high accuracy.
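The attacker-first game can be illustrated with a brute-force sketch: enumerate every set of k edge removals and keep the one minimizing the defender's max flow. This is exponential in k (unlike the polynomial-time framework above) and only meant to show the bilevel structure; the helper names are ours.

```python
from collections import deque
from itertools import combinations

def max_flow(n, capacities, s, t):
    """Edmonds-Karp max flow on an n x n capacity matrix (defender's problem)."""
    cap = [row[:] for row in capacities]  # residual capacities
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = [-1] * n
        parent[s] = s
        queue = deque([s])
        while queue and parent[t] == -1:
            u = queue.popleft()
            for v in range(n):
                if cap[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    queue.append(v)
        if parent[t] == -1:
            return flow  # no augmenting path left
        # push the bottleneck capacity along the path
        path, v = [], t
        while v != s:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[u][v] for u, v in path)
        for u, v in path:
            cap[u][v] -= push
            cap[v][u] += push
        flow += push

def interdict(n, capacities, s, t, k):
    """Attacker's problem: remove exactly k edges to minimize the max flow."""
    edges = [(u, v) for u in range(n) for v in range(n) if capacities[u][v] > 0]
    best = max_flow(n, capacities, s, t)
    for removed in combinations(edges, k):
        cap = [row[:] for row in capacities]
        for u, v in removed:
            cap[u][v] = 0
        best = min(best, max_flow(n, cap, s, t))
    return best
```

On a small diamond network (source 0, sink 3, two disjoint paths of capacity 2 each), removing any single edge cuts one path, halving the flow from 4 to 2.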

External data compression is a compression technique in which the explicit data is stored on an external source such as hard disks, streams, or any combination of distributed devices. The local device stores, in memory or cache, a small amount of explicit data that is most likely to be used in the future, along with the information required to stream (in small chunks) the rest of the data from the external source. This last process is called reconstruction.

We use time-space trade-off techniques to execute stack algorithms with external compression in a fashion that incurs only a linear slowdown compared to classical stacks while reducing the space used by several orders of magnitude.

Current work, available as CompressedStacks.cpp, extends this technique to other types of containers while trying to reduce or eliminate the reconstruction time. All our algorithms are black boxes: the user only selects a compression rate relative to the original algorithm.
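A toy Python sketch of the compressed-stack idea (not the CompressedStacks.cpp API; names are ours): in the classic next-greater-element scan, the stack content is always recoverable from the read-only input, so only the top block of b indices is kept explicitly and each deeper block is summarized by two indices, for O(b + n/b) local space. Popping into a summarized block triggers a reconstruction pass over the corresponding input range.

```python
class CompressedMonotoneStack:
    """Toy compressed stack for the next-greater-element scan.

    The stack always holds indices j whose value dominates everything
    pushed after them, so any block can be rebuilt from the read-only
    input alone by a right-to-left suffix-maxima scan.
    """

    def __init__(self, data, block=4):
        self.data = data        # external, read-only input
        self.block = block      # size of the explicit top block
        self.top = []           # explicit top block (indices into data)
        self.summaries = []     # (bottom_index, top_index) per full block

    def push(self, i):
        if len(self.top) == self.block:
            # compress the explicit block down to two indices
            self.summaries.append((self.top[0], self.top[-1]))
            self.top = []
        self.top.append(i)

    def pop(self):
        if not self.top:
            self._reconstruct()
        return self.top.pop()

    def peek_value(self):
        if not self.top:
            self._reconstruct()
        return self.data[self.top[-1]]

    def _reconstruct(self):
        """Rebuild the newest summarized block (this is the reconstruction
        step: a scan over the external input range, no stored elements)."""
        lo, hi = self.summaries.pop()
        best, rebuilt = float("-inf"), []
        for j in range(hi, lo - 1, -1):
            if self.data[j] >= best:   # j survives: nothing later beats it
                rebuilt.append(j)
                best = self.data[j]
        self.top = rebuilt[::-1]

def next_greater(data, block=4):
    """Index of the next strictly greater element, -1 if none."""
    st = CompressedMonotoneStack(data, block)
    res = [-1] * len(data)
    for i, x in enumerate(data):
        while (st.top or st.summaries) and st.peek_value() < x:
            res[st.pop()] = i
        st.push(i)
    return res
```

The `block` parameter plays the role of the compression rate mentioned above: larger blocks mean more explicit memory and fewer reconstructions.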

The era of information brings a broad variety of challenges to the treatment of data. The amount of data is obviously one. But finer aspects, such as the structure or the entanglement of data, are a key to understanding and interpreting it. This research provides a set of theoretical and practical tools to structure, treat, and analyze modern data.

We provide the multilayer stream graph, a new data structure that combines multilayer aspects (entanglement) and temporal components. We also designed a new metric for the transmission of knowledge (or influence) within networks of information, including multilayer stream graphs. This metric is particularly useful for detecting oddities and differences between items sharing a similar context (for instance, articles with the same h-index in the same field).
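To illustrate how layers and time interact, here is a minimal Python sketch of a multilayer stream graph (illustrative only, not the published structure or the ModernGraphs.cpp API): each link carries a layer and an activity interval, and queries combine both dimensions.

```python
from collections import defaultdict

class MultilayerStreamGraph:
    """Toy multilayer stream graph: a link (u, v) is active on a given
    layer during a time interval [t0, t1]."""

    def __init__(self):
        self.links = defaultdict(list)  # layer -> [(u, v, t0, t1), ...]

    def add_link(self, layer, u, v, t0, t1):
        self.links[layer].append((u, v, t0, t1))

    def neighbors(self, layer, node, t):
        """Nodes linked to `node` on `layer` at time `t` (temporal view)."""
        out = set()
        for u, v, t0, t1 in self.links[layer]:
            if t0 <= t <= t1:
                if u == node:
                    out.add(v)
                elif v == node:
                    out.add(u)
        return out

    def active_layers(self, u, v, t):
        """Layers on which u and v are linked at time `t` (entanglement view)."""
        return {
            layer
            for layer, links in self.links.items()
            for (a, b, t0, t1) in links
            if t0 <= t <= t1 and {a, b} == {u, v}
        }
```

A transmission metric such as the one described above would be computed on top of such a structure, following links that are simultaneously active in time and connected across layers.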

A set of open-source software supports this work, among others ModernGraphs.cpp and StaticWebPages.jl.