Streaming Hypothesis Reasoning

AIM will advance streaming reasoning techniques to overcome a limitation of contemporary inference, which analyzes only the data held in a fixed cache or a moving window. This research will lead to methods that continuously evaluate proposed hypotheses as new knowledge arrives from streams of propositions, with particular emphasis on how relaxing the expectation of completeness affects the soundness and performance of symbolic deduction platforms. The work will address challenges in sampling rates, cache maintenance, deductive reasoning, and ranking of conclusions. Given a fixed-size cache, the challenge is to identify and preserve "interesting" assertions from the stream. Deductive reasoning will be performed continuously over the cache so that conclusions relevant to the proposed hypotheses are drawn as quickly as possible. The use of a cache differentiates our work from the state of the art in deductive stream reasoning: the cache enables us to "remember" propositions that are no longer in the stream window. Such a capability is potentially transformative for data-intensive science as well as for investigative processes that draw from heterogeneous data sources. The primary research challenge lies in determining which propositions should be cached at any point in time in order to maximize the benefit of deductive reasoning to the user.
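
To illustrate the cache-plus-reasoning loop described above, the following is a minimal sketch, assuming a Datalog-style encoding of ground propositions and Horn rules, a fixed-capacity cache scored by a user-supplied "interest" function, and naive forward chaining over the cached propositions after each arrival. The names (Proposition, Rule, InterestCache, forward_chain), the scoring function, and the Python framing are illustrative assumptions, not the project's actual design or API.

```python
# Illustrative sketch only: a fixed-size "interest" cache over a proposition
# stream, with forward chaining run over the cache after each arrival.
# All names and the scoring heuristic are assumptions for this example.
from dataclasses import dataclass
from typing import Callable


@dataclass(frozen=True)
class Proposition:
    """A fact or pattern; arguments beginning with '?' are variables."""
    predicate: str
    args: tuple


@dataclass(frozen=True)
class Rule:
    """A Horn rule: if every body pattern matches, derive the head."""
    body: tuple
    head: Proposition


def unify(pattern: Proposition, fact: Proposition, env: dict) -> dict | None:
    """Match a pattern against a ground fact, extending variable bindings."""
    if pattern.predicate != fact.predicate or len(pattern.args) != len(fact.args):
        return None
    env = dict(env)
    for p, f in zip(pattern.args, fact.args):
        if p.startswith('?'):
            if env.get(p, f) != f:
                return None
            env[p] = f
        elif p != f:
            return None
    return env


def _match_body(body, facts, env):
    """Yield every binding under which all body patterns match cached facts."""
    if not body:
        yield env
        return
    for fact in facts:
        bound = unify(body[0], fact, env)
        if bound is not None:
            yield from _match_body(body[1:], facts, bound)


def forward_chain(facts: set, rules: list) -> set:
    """Naively apply the rules over the cached facts until fixpoint."""
    derived = set(facts)
    while True:
        snapshot = frozenset(derived)
        new = {
            Proposition(r.head.predicate,
                        tuple(env.get(a, a) for a in r.head.args))
            for r in rules
            for env in _match_body(r.body, snapshot, {})
        } - derived
        if not new:
            return derived
        derived |= new


class InterestCache:
    """Fixed-size cache that evicts the least 'interesting' proposition."""
    def __init__(self, capacity: int, score: Callable[[Proposition], float]):
        self.capacity, self.score = capacity, score
        self.items: dict[Proposition, float] = {}

    def admit(self, p: Proposition) -> None:
        self.items[p] = self.score(p)
        while len(self.items) > self.capacity:
            del self.items[min(self.items, key=self.items.get)]  # drop lowest score

    def facts(self) -> set:
        return set(self.items)


# Example hypothesis about 'carol': score propositions mentioning her higher,
# so they survive eviction even after they leave the stream window.
rules = [
    Rule(body=(Proposition('parent', ('?x', '?y')),),
         head=Proposition('ancestor', ('?x', '?y'))),
    Rule(body=(Proposition('parent', ('?x', '?y')),
               Proposition('ancestor', ('?y', '?z'))),
         head=Proposition('ancestor', ('?x', '?z'))),
]
cache = InterestCache(capacity=100,
                      score=lambda p: 1.0 if 'carol' in p.args else 0.1)
for fact in [Proposition('parent', ('alice', 'bob')),
             Proposition('parent', ('bob', 'carol'))]:
    cache.admit(fact)                                  # cache maintenance per arrival
    conclusions = forward_chain(cache.facts(), rules)  # continuous deduction
    print(sorted(c.args for c in conclusions if c.predicate == 'ancestor'))
```

In a streaming deployment, the interest score would reflect relevance to the active hypotheses and could change over time; deciding which propositions to retain so that deduction remains maximally useful is exactly the open question the research targets.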

