Over the past years, I have been fortunate to be in contact with many brilliant researchers who introduced me to a variety of research topics and who influenced, and still influence, my research. I currently have several different, but related, research interests.

I detail them below:

Proof Theory and Logical Frameworks


Proof theory has offered much to computer science. The Curry-Howard isomorphism establishes a connection between the functional programming paradigm and the normalization of proofs (cut-elimination). The focusing discipline provides a proof-theoretic explanation of logic programming via the computation-as-proof-search paradigm.

Dale Miller introduced me to this world during my PhD. We worked on understanding concepts and operators for which logic does not establish a canonical form, namely, the polarity assignment of atomic formulas and the linear logic exponentials.

Polarity of Atoms: In a focused proof system, logical connectives are classified as positive if their introduction rule is not invertible, and as negative otherwise. While this classification extends to non-atomic formulas, e.g., a formula is positive if its main connective is positive, it is arbitrary for atomic formulas. In fact, one can choose arbitrarily which polarity is assigned to atoms and the focusing discipline will still be complete. Interestingly, however, the choice of polarity does have an important impact on the shape of the focused proofs available. For instance, the proofs of the focused proof system for LJ correspond to forward-chaining or backward-chaining proofs when atoms are assigned positive or negative polarity, respectively.
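
To make this impact concrete, here is a small, standard Horn-clause illustration of the two disciplines (a hypothetical example, not taken from the papers cited below):

    % Theory and goal:
    \[
      \Gamma \;=\; \{\, a,\;\; b,\;\; a \wedge b \supset c \,\}
      \qquad \text{goal: } c
    \]
    % Negative atoms: focusing on the clause reduces the goal c to the
    % subgoals a and b, that is, goal-directed backward chaining (as in
    % SLD resolution).
    % Positive atoms: focusing on the clause may only fire once a and b
    % are already present in the context, adding the new fact c, that is,
    % forward chaining.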

We investigated the kind of proofs that one obtains when using linear logic as a logical framework under different polarity assignments. We showed that a number of proof systems, such as LJ and NJ, can be encoded in linear logic by playing with the polarity of atomic formulas [JAR’10]. We obtained a very strong notion of adequacy, namely adequacy at the level of derivations. This result establishes that proof search in the encoded systems corresponds exactly to proof search in the linear logic framework. We also used the polarity of atoms to formalize a notion of tables as proofs in our CSL’07 paper, a result considerably extended in my thesis.

It seems that the polarity of atoms may be used to capture a number of other features proof-theoretically. We are investigating possible applications.

Subexponentials: It has been known since the invention of linear logic that the linear logic exponentials are not canonical. Danos et al. proposed a refinement of the linear logic proof system, SELL, which allows for many colored exponential-like operators, which we call subexponentials [PPDP’09]. We proposed a logical framework called SELLF, which is a focused proof system for SELL [PPDP’09], and showed that it considerably increases the expressiveness of linear logic when used as a logical framework. In a series of papers [PPDP’09, LICS’12, CONCUR’13, ENTCS’14, JLC’14], together with Elaine Pimentel, Giselle Reis and Carlos Olarte, we have investigated the power of SELLF and shown that a great number of systems can be encoded in it.
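
For readers unfamiliar with these operators, the following is a rough sketch of the subexponential promotion rule, written from memory of [PPDP’09]; the exact notation there may differ. A subexponential signature provides a preordered set of labels, a subset of which admits weakening and contraction, and each label a comes with its own pair of exponential-like connectives !^a and ?^a:

    % Promotion for !^a (one-sided sequent): allowed only when every other
    % formula is guarded by a label above a in the preorder.
    \[
      \frac{\;\vdash\; ?^{b_1} G_1,\; \ldots,\; ?^{b_n} G_n,\; F\;}
           {\;\vdash\; ?^{b_1} G_1,\; \ldots,\; ?^{b_n} G_n,\; !^{a} F\;}
      \qquad \text{provided } a \preceq b_i \text{ for every } i
    \]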

Tools for Proof Theory: These theoretical results have led us to start an effort to automate the proofs of many foundational results in the form of tools. Two such tools have been implemented.

There is still plenty of room for research on these topics. In the long term, we would like to build an extensive theory that explains how to build systems with a clear proof theory and how to decide among the possible choices for specifying systems. This theory will be accompanied by sensible tools that help specify and verify these logics and systems.

Foundations of Computer Security


I was fortunate to do my first post-doc with Andre Scedrov, who introduced me to the exciting field of computer security. Since then, we have been working together with Max Kanovich and Tajana Ban Kirigin on a number of topics, including the foundations of computer security.

Protocol Security: In symbolic protocol security verification, one considers a very powerful intruder called the Dolev-Yao intruder. He acts as the network, with the capability of intercepting and sending messages, as well as composing and decomposing them as long as he knows the right decryption keys (encryption is assumed to be perfect).
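
To give a flavour of this symbolic model, the sketch below (illustrative only, not taken from our papers) closes the intruder's knowledge under the decomposition rules; the composition rules (pairing and encrypting with known material) are omitted since they generate infinitely many messages:

    # Hypothetical encoding: messages are atoms (strings) or tagged tuples
    # ("pair", m1, m2) and ("enc", m, k), with perfect cryptography.
    def dy_decompose(knowledge):
        known = set(knowledge)
        changed = True
        while changed:
            changed = False
            for m in list(known):
                derived = set()
                if isinstance(m, tuple) and m[0] == "pair":
                    derived |= {m[1], m[2]}            # project both components
                if isinstance(m, tuple) and m[0] == "enc" and m[2] in known:
                    derived.add(m[1])                  # decrypt with a known key
                if not derived <= known:
                    known |= derived
                    changed = True
        return known

    # The intruder learns the secret "s" once he also knows the key "k":
    assert "s" in dy_decompose({("enc", "s", "k"), "k"})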

The Dolev-Yao intruder is indeed powerful. He can, for example, remember as many facts as he needs. This capability seems to be key to the undecidability of verifying whether a protocol is safe, i.e., of checking whether the intruder can discover some secret, also called the secrecy problem. We proposed a less powerful intruder with bounded memory [FAST’10, IC’14] and showed that the secrecy problem then becomes decidable (PSPACE-complete). We did not make any assumptions on the number of nonces that can be generated, but we did assume that all protocols are bounded memory, which means that there is a bound on the number of simultaneous protocol sessions.

Interestingly, many existing protocols are bounded memory, and we showed that many known anomalies of such protocols can also be carried out by our bounded memory intruder, provided he has enough memory [FAST’10, IC’14]. This led us to the question of whether bounded memory intruders can approximate the traditional Dolev-Yao intruder, which has unbounded memory, when considering only bounded memory protocols [ESORICS’13, COMLAN’14]. We showed that this is not the case: the secrecy problem is undecidable for the Dolev-Yao intruder even in the presence of bounded memory protocols.

Collaborative Systems: In a collaborative system, agents work together to achieve a goal while at the same time trying to prevent bad things from happening, i.e., bad states from being reached. We have been working on different problems for collaborative systems, for instance the planning problem: is there a sequence of actions (a plan) that leads from an initial state to a goal state without reaching any critical state?
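
Abstracting away the concrete state representation, and ignoring for a moment that nonce creation makes the state space infinite, the planning problem is reachability that avoids critical states. A minimal sketch, with made-up interfaces:

    from collections import deque

    # plan_exists: is some goal state reachable from `initial` without ever
    # passing through a critical state? `successors` maps a state to the
    # states reachable by one action. (Hypothetical interface, illustrative
    # only; termination assumes a finite state space.)
    def plan_exists(initial, successors, is_goal, is_critical):
        if is_critical(initial):
            return False
        seen, queue = {initial}, deque([initial])
        while queue:
            state = queue.popleft()
            if is_goal(state):
                return True
            for nxt in successors(state):
                if nxt not in seen and not is_critical(nxt):
                    seen.add(nxt)
                    queue.append(nxt)
        return False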

With the help of Carolyn Talcott, we investigated systems with different requirements, both from a foundational and from an implementation point of view. We started with systems that require the creation of nonces, as in administrative systems: whenever a transaction is created, a fresh number is assigned to it so as to uniquely identify it. We showed that under some conditions, namely that actions are balanced [Rowe’s PhD Thesis], the planning problem is PSPACE-complete. If we allow branching actions, that is, actions that may have non-deterministic effects, the planning problem becomes EXPTIME-complete [RTA’12].

We also considered systems that mention discrete time explicitly, such as clinical trials. Including time raises many challenges for verifying the planning problem. We tackled them by elaborating conditions on the type of actions available [RTA’12] and showed that the complexity results obtained for untimed systems remain the same. We also argued that such a formalization could be used for monitoring clinical trials [IHI’12] and built a prototype of a Clinical Trial Assistant using Maude.

There are many directions for this work. For instance, we have not yet considered more elaborate systems where spatial modalities are needed. We are also considering systems that require dense time instead of discrete time, e.g., distance bounding protocols. Finally, we are in contact with the clinical trial industry, collaborating to implement a more sophisticated Clinical Trial Assistant.

Denotational Semantics


Working with smart people always helps. This is the case with Martin Hofmann and Nick Benton, specialists in denotational semantics for programming languages. The denotational semantics literature is very rich and exciting, and I am still taking my baby steps towards grasping the available theories, their power, and their differences.

Our main goal is to justify effect-based program rewrite equations using denotational semantics. The following is an example of such an equation:

        Gamma |- e1;e2 = e2;e1

provided that e1 and e2 read and write disjoint locations (or regions).
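
A minimal, hypothetical illustration of when this rewrite is sound: two commands whose footprints are disjoint store locations.

    # e1 reads and writes only location "x"; e2 reads and writes only "y".
    # Since their footprints are disjoint, both execution orders produce the
    # same final store, which is what the equation e1;e2 = e2;e1 claims.
    def e1(store):
        store["x"] = store["x"] + 1

    def e2(store):
        store["y"] = store["y"] * 3

    s1 = {"x": 1, "y": 2}; e1(s1); e2(s1)
    s2 = {"x": 1, "y": 2}; e2(s2); e1(s2)
    assert s1 == s2                      # both orders yield {"x": 2, "y": 6}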

Proof-Relevant Logical Relations: Allocation poses many problems when trying to find suitable denotational models for programming languages with this feature. Existing models have been found to collapse under fixpoint constructions, the main problem apparently being spurious existential quantifications. In [TLCA’13] we provided proof-relevant logical relations and showed that they can be used to solve these problems. We formalize worlds and their extensions in a categorical setting, using setoids and pullback squares. We are then able to justify the soundness of some equations and to provide a denotational model for a programming language with fixpoint constructs and name generation.

Abstract Effects: We introduced the notion of abstract effects to justify coarser effect annotations for programs [POPL’14]. Consider, for example, a data structure that stores and manipulates sets, represented as linked lists. Such a data structure comes with functions to add and remove an element from the set, as well as a membership function that checks whether an element belongs to the set. Using abstract effects, we are able to justify assigning only read effects to the membership function, even though it does not merely check whether an element is in the linked list but also performs optimizations, such as removing duplicates, thereby writing to the data structure. We were also able to prove the soundness of a number of effect-based rewrites, such as the program commutation equation shown above.
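
The following sketch of the set-as-list example is hypothetical (the names and the particular optimization are mine), but it shows the phenomenon: member() rewrites the underlying list, yet at the abstraction of the set it represents it behaves as a pure read.

    class ListSet:
        """A set represented as a list that may contain duplicates."""
        def __init__(self):
            self.items = []

        def add(self, x):
            self.items.append(x)                 # abstract (and concrete) write

        def member(self, x):
            # Concrete write: compact the representation while searching.
            seen, compacted, found = set(), [], False
            for y in self.items:
                if y not in seen:
                    seen.add(y)
                    compacted.append(y)
                if y == x:
                    found = True
            self.items = compacted               # representation changes,
            return found                         # but the set value does not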

There are a number of exciting directions arising from this work, for example understanding abstract effects in concurrent programs, or investigating ways to automate some of the soundness proofs. We have so far not considered higher-order store, and finding suitable models for such cases is also a topic of future work.

Formal Methods Applied to Networking


During my post-doc at UPENN, I started investigating applications of formal methods to the verification of network systems. I was involved in Boon Thau Loo’s Network Datalog project, a fascinating project on using declarative programming languages to implement routing protocols. During my stay, we formalized the operational semantics of the core of a Datalog-like language for distributed systems and proved its correctness with respect to the usual centralized Datalog semantics [PPDP’11, COMLAN’12]. The main challenge is that evaluation in a distributed system is not synchronous; thus one needs to prove that evaluation in the distributed setting is correct w.r.t. the centralized Datalog semantics for any interleaving of the computations of the distributed processes. In the literature, this concurrent computation is handled by assuming that message transmission is FIFO, which is of course a strong assumption. We proved that our operational semantics works even when this assumption is lifted.
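
As a point of reference, the centralized semantics against which the distributed evaluation is proved correct is the usual bottom-up fixpoint of a Datalog program. Here is a small sketch for the standard reachability program (path(X,Y) :- link(X,Y). path(X,Z) :- link(X,Y), path(Y,Z).); it is illustrative only and not code from the project:

    # Naive bottom-up evaluation of the reachability program; the distributed
    # evaluation must compute the same fixpoint for every interleaving of
    # node-to-node messages.
    def reachability(links):
        path = set(links)                        # path(X,Y) :- link(X,Y).
        changed = True
        while changed:
            changed = False
            for (x, y) in links:
                for (y2, z) in list(path):
                    if y == y2 and (x, z) not in path:
                        path.add((x, z))         # path(X,Z) :- link(X,Y), path(Y,Z).
                        changed = True
        return path

    # reachability({("a","b"), ("b","c")}) == {("a","b"), ("b","c"), ("a","c")}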

In a different line of work [TON’12], I was also involved in the construction of a formal framework for verifying the convergence of routing policies. It is known that if the routing policies of ISPs conflict, then one cannot guarantee that the routing tables will converge. We proposed a framework that, given a network topology and routing policies, automatically checks whether the routing tables always converge. For that, we reduced this problem to one solvable by an SMT solver.

Finally, more recently, I have started to work with Prof. Iguatemi Fonseca on DDoS attacks. More specifically, we have investigated the use of formal methods for the analysis of defences against Application-Layer DDoS attacks. The difference from traditional attacks is that Application-Layer DDoS attacks can target a specific application on a server, such as a web server, thus generating less traffic and being harder to detect. We have proposed a defence called SeVen [EISIC’14], based on the Adaptive Selective Verification (ASV) defence for Network-Layer DDoS attacks. The main difference is that while ASV assumes a simple stateless SYN-ACK client-server communication, which is enough for mitigating Network-Layer DDoS attacks, SeVen is state-dependent, assuming more complicated communication patterns. We formally analysed SeVen, evaluating its performance against attacks using the HTTP protocol, such as the HTTP-Pragma and HTTP-Post attacks [EISIC’14]. We have also discovered a novel attack using the VoIP protocol SIP and formally analysed SeVen’s ability to mitigate it [SUB’14].
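
The selective-verification idea underlying SeVen can be caricatured as follows; this is a heavily simplified, hypothetical sketch, and the actual policy and parameters analysed in [EISIC’14] differ:

    import random

    # The server keeps a bounded buffer of pending application-layer requests.
    # When the buffer is full, an incoming request is not simply refused:
    # with some probability it replaces a randomly chosen pending request.
    class SelectiveServer:
        def __init__(self, capacity=10, accept_prob=0.5):
            self.capacity = capacity
            self.accept_prob = accept_prob
            self.pending = []

        def on_request(self, req):
            if len(self.pending) < self.capacity:
                self.pending.append(req)                 # room left: enqueue
            elif random.random() < self.accept_prob:
                victim = random.randrange(self.capacity)
                self.pending[victim] = req               # evict a random request
            # otherwise the incoming request is dropped

        def on_service_tick(self):
            return self.pending.pop(0) if self.pending else None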