URCS Projects

Research Area: ai

  • We developed a systematic framework for recognizing realistic actions in unconstrained amateur videos, which exhibit tremendous variation due to camera motion, background clutter, changes in object appearance and scale, and more.

  • A computer vision method for extracting lineal features, both curved and straight, from an image, using extended local information to provide robustness and sensitivity.

  • Cluster computing allows standard digital analysis and restoration techniques to be applied to high-resolution microscopic digitizations of Daguerreotypes from the collection of the George Eastman House in Rochester.  The image context of a feature (such as a small light spot) affects its probability of being noise (a dust speck) or signal (foliage), and machine learning can be used to automate some of these subtle decisions.
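
    A minimal sketch of the context-sensitive idea, in Python. The feature set and the nearest-centroid classifier are illustrative assumptions, not the project's actual pipeline; hand-labeled example spots are taken as given.

      import numpy as np

      def context_features(img, y, x, r=8):
          # Summarize the neighborhood of a candidate bright spot at (y, x).
          patch = img[max(0, y - r):y + r + 1, max(0, x - r):x + r + 1].astype(float)
          gy, gx = np.gradient(patch)
          return np.array([patch.mean(),               # local brightness
                           patch.std(),                # local texture energy
                           np.hypot(gy, gx).mean()])   # edge density: foliage is busy

      def train_centroids(img, labeled_spots):
          # labeled_spots: list of ((y, x), 'dust' or 'foliage') examples.
          feats = {'dust': [], 'foliage': []}
          for (y, x), label in labeled_spots:
              feats[label].append(context_features(img, y, x))
          return {k: np.mean(v, axis=0) for k, v in feats.items()}

      def classify(img, y, x, centroids):
          # A spot is dust or foliage depending on which context profile it matches.
          f = context_features(img, y, x)
          return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))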

  • We developed a user-friendly system that lets a user interactively segment objects of interest from a group of related images by providing scribble guidance.
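
    The interaction model can be illustrated with a generic geodesic label propagation, a stand-in rather than the project's algorithm: each pixel takes the integer label of the scribble that reaches it most cheaply, where crossing a strong intensity change costs more.

      import heapq
      import numpy as np

      def segment(img, scribbles):
          # img: HxW grayscale array; scribbles: dict int label -> [(y, x), ...].
          h, w = img.shape
          dist = np.full((h, w), np.inf)
          label = np.zeros((h, w), dtype=int)
          heap = []
          for lab, pixels in scribbles.items():
              for y, x in pixels:
                  dist[y, x], label[y, x] = 0.0, lab
                  heapq.heappush(heap, (0.0, y, x, lab))
          while heap:                                  # Dijkstra over the pixel grid
              d, y, x, lab = heapq.heappop(heap)
              if d > dist[y, x]:
                  continue
              for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                  if 0 <= ny < h and 0 <= nx < w:
                      nd = d + abs(float(img[ny, nx]) - float(img[y, x])) + 1e-6
                      if nd < dist[ny, nx]:
                          dist[ny, nx], label[ny, nx] = nd, lab
                          heapq.heappush(heap, (nd, ny, nx, lab))
          return label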

  • The goal of the Laboratory for Assisted Cognition Environments (LACE) is to create advanced computer systems that will enhance the quality of life of people suffering from cognitive disabilities. This interdisciplinary project combines computer science research in artificial intelligence and ubiquitous computing with clinical research on patient care.

  • 'Like' has become a very popular function on social networks, allowing users to express their positive opinions of certain objects. It provides an accurate way of gauging user interests and an effective way of sharing or promoting information in social media. We developed a system called LikeMiner that uses a heterogeneous network model and related mining algorithms to estimate the representativeness and influence of objects.
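
    The propagation idea can be sketched as HITS-style power iteration over the bipartite user-object 'like' graph; LikeMiner's actual heterogeneous network model and influence estimation are richer than this toy version.

      import numpy as np

      def object_scores(likes, n_users, n_objects, iters=50):
          # likes: list of (user, object) index pairs from the 'like' graph.
          A = np.zeros((n_users, n_objects))
          for u, o in likes:
              A[u, o] = 1.0
          obj = np.ones(n_objects)
          for _ in range(iters):          # mutual reinforcement:
              usr = A @ obj               # a user matters if they like good objects;
              obj = A.T @ usr             # an object matters if good users like it
              obj /= np.linalg.norm(obj) or 1.0
          return obj                      # higher = more representative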

  • Solving combinatorially challenging planning problems by encoding them as Boolean satisfiability and applying state-of-the-art SAT solvers.
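
    The encoding idea on a toy one-step domain: variables stand for facts and actions at each time step; clauses encode the initial state, goal, preconditions, effects, and frame axioms; any satisfying assignment is a plan. Real encodings go to a modern SAT solver; the brute-force search here is only to keep the sketch self-contained.

      from itertools import product

      def solve(cnf, n_vars):
          # cnf: list of clauses, each a list of +/- 1-based variable ids.
          for bits in product([False, True], repeat=n_vars):
              if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in cnf):
                  return bits
          return None

      open0, do_open, open1 = 1, 2, 3       # door open at t=0, action, open at t=1
      cnf = [[-open0],                      # initial state: door closed
             [open1],                       # goal: door open at t=1
             [-do_open, -open0],            # precondition: only a closed door opens
             [-do_open, open1],             # effect of the action
             [-open1, open0, do_open]]      # frame: it opens only via the action
      print(solve(cnf, 3))                  # (False, True, True): plan = do_open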

  • Methods for translating between natural languages (such as English and Chinese) by training statistical models on large collections of text.
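
    A minimal sketch of the statistical core: IBM Model 1, which learns word-translation probabilities from sentence pairs by expectation-maximization. The tiny corpus is illustrative; deployed systems layer alignment, phrase, and language models on top, trained on far larger collections.

      from collections import defaultdict

      def ibm_model1(pairs, iters=10):
          # pairs: (foreign_sentence, english_sentence) word-list pairs.
          t = defaultdict(lambda: 1.0)                   # t(f|e), uniform start
          for _ in range(iters):
              count, total = defaultdict(float), defaultdict(float)
              for fs, es in pairs:                       # E-step: expected alignments
                  for f in fs:
                      z = sum(t[(f, e)] for e in es)
                      for e in es:
                          c = t[(f, e)] / z
                          count[(f, e)] += c
                          total[e] += c
              t = defaultdict(float, {fe: count[fe] / total[fe[1]]
                                      for fe in count})  # M-step: re-estimate
          return t

      pairs = [("la maison".split(), "the house".split()),
               ("la fleur".split(), "the flower".split())]
      t = ibm_model1(pairs)
      print(round(t[("la", "the")], 2))                  # 'la' aligns with 'the'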

  • The TRAINS project and its successors form one of the longest running research efforts on practical spoken dialogue: conversation undertaken with a specific task in mind.

  • Quagents are intelligent software agents that live in a simulated world provided by the Quake II game engine.

Research Area: hci

  • VizWiz is an iPhone application aimed at enabling blind people to recruit remote sighted workers to help them with visual problems in near real time.

  • WebAnywhere is a free web-based screen reader enabling blind web users to benefit from the availability of public computers. Information on the web can be accessed from any computer that has a sound card without the need to install screen-reader software.

Research Area: systems

  • Many existing programs have dynamic parallelism at the high level but are hard to parallelize because of uncertainty in implementation and program input.  Behavior-oriented parallelization (BOP) provides a suggestion interface through which a user marks possible parallelism, together with run-time support that guarantees correctness and efficient execution whether or not the hints are correct.  BOP is based on frequent, input-dependent behavior rather than definite behavior.  It enables program parallelization based on partial information and is useful for incrementally parallelizing a program or streamlining it for common uses.
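
    A toy model of the commit test, with explicit read sets and an externally supplied conflict set standing in for BOP's actual page-granularity tracking in forked processes. The point is the safety property: a wrong hint costs re-execution time, never correctness.

      def try_speculate(state, region, concurrent_writes):
          # region(state) -> (result, read_set); runs 'ahead' on a private copy.
          result, read_set = region(dict(state))
          if read_set & concurrent_writes:       # conflict: the hint was wrong
              result, _ = region(state)          # fall back to sequential execution
          return result

      def region(s):                             # a hypothetical hinted region
          return s['x'] * 2, {'x'}

      print(try_speculate({'x': 21}, region, concurrent_writes=set()))   # commits: 42
      print(try_speculate({'x': 21}, region, concurrent_writes={'x'}))   # re-runs: 42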

  • Our compiler research addresses the twin concerns of correctness and performance, with a focus on recurrence in the use of data—determining whether a complex program has an inherent pattern of data reuse and, if so, to what degree that pattern can be modeled, measured, and modified (improved).

  • This project addresses the challenge of mainstream parallelism using a combined hardware-software approach. The key idea is to identify common time-critical operations, across a variety of applications and programming models, that might be accelerated or simplified by new architectural mechanisms, and then to design those mechanisms in as general a fashion as possible. Candidate mechanisms include the alert-on-update notification mechanism, programmable data isolation, adaptive cooperative caching, and fine-grain access control.

  • Synchronization serves to constrain the interleaving of actions performed by multiple threads of control (e.g., on a multicore processor), allowing only correct executions.  Over the years, this ongoing project has developed some of the most efficient and widely used algorithms for locking, concurrent data structures, and transactional memory.
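
    One of those algorithms, the MCS queue lock (Mellor-Crummey and Scott), in sketch form: each waiting thread spins only on its own queue node, so contended acquisition causes no global spinning. Real implementations rely on hardware atomic swap and compare-and-swap; the small helper below emulates them with a mutex purely so the sketch runs as ordinary Python.

      import threading

      class Atomic:
          # Stands in for the hardware atomic swap/CAS the algorithm assumes.
          def __init__(self, value=None):
              self._value, self._guard = value, threading.Lock()
          def swap(self, new):
              with self._guard:
                  old, self._value = self._value, new
                  return old
          def cas(self, expect, new):
              with self._guard:
                  if self._value is expect:
                      self._value = new
                      return True
                  return False

      class Node:
          def __init__(self):
              self.locked, self.next = True, None

      class MCSLock:
          def __init__(self):
              self.tail = Atomic(None)
          def acquire(self, node):
              node.locked, node.next = True, None
              pred = self.tail.swap(node)          # enqueue at the tail
              if pred is not None:
                  pred.next = node
                  while node.locked:               # spin on our own node only
                      pass
          def release(self, node):
              if node.next is None:
                  if self.tail.cas(node, None):    # no successor: lock is free
                      return
                  while node.next is None:         # successor mid-enqueue: wait
                      pass
              node.next.locked = False             # hand off to the successor

      lock, node = MCSLock(), Node()               # each thread brings its own node
      lock.acquire(node); lock.release(node)       # handoff is FIFO under contention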

  • Server systems provide computing or storage to a potentially large number of simultaneous clients.  Our research responds to the increasing emphasis on information (data) rather than mere computing, and specifically addresses system manageability and dependability.

  • This project investigates system-level techniques to better manage parallel and concurrent I/O for high-end computing.  Specific techniques include model-driven performance debugging, multi-level I/O tracing, 2-competitive I/O prefetching, and the exploitation of emerging solid-state storage technology.

  • This project investigates profile-driven performance models for multi-component, data-intensive online services.  It explores a variety of techniques, and has, among other things, identified previously unknown I/O performance bugs in Linux.

  • Reuse distance and program footprint are two basic metrics we use to study the twin concerns of memory system performance and correctness, with a focus on recurrence in the use of data—determining whether a complex program has an inherent pattern of data reuse and, if so, to what degree that pattern can be modeled, measured, and modified (improved).
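
    The two metrics in executable form: the reuse distance of an access is the number of distinct data items touched since the previous access to the same item, and the footprint of a window is the number of distinct items the window touches. Production tools compute these in near-linear time; the quadratic scan below is only for clarity.

      def reuse_distances(trace):
          last, dists = {}, []
          for i, x in enumerate(trace):
              if x in last:
                  dists.append(len(set(trace[last[x] + 1:i])))
              else:
                  dists.append(None)             # cold access: no prior use
              last[x] = i
          return dists

      def footprint(trace, start, length):
          return len(set(trace[start:start + length]))

      print(reuse_distances(list("abcab")))      # [None, None, None, 2, 2]
      print(footprint(list("abcab"), 0, 4))      # 3 distinct items in the window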

  • This project monitors computers in the field, in real time, and records memory errors as they occur.  It reveals that soft (transient) errors are orders of magnitude less frequent than previously reported.  It combines the soft and hard (permanent) error rates to predict failure rates and patterns for systems as a whole.

  • Transactional memory (TM) allows programmers to specify operations that should execute atomically, without worrying about how that atomicity should be achieved.  Downloaded to thousands of sites worldwide, RSTM provides a diverse suite of efficient, mutually compatible TM run-time systems.
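
    A toy write-buffering TM in the flavor of such runtimes (loosely TL2-like, with read-set validation at commit); RSTM's actual systems are far more sophisticated. The point is the programming model: the transaction body never names a lock.

      import threading

      class TVar:
          def __init__(self, value):
              self.value, self.version = value, 0

      _commit = threading.Lock()

      def atomic(txn):
          # Run txn(read, write) atomically, retrying on conflict.
          while True:
              reads, writes = {}, {}
              def read(v):
                  if v in writes:                    # read-your-own-writes
                      return writes[v]
                  reads.setdefault(v, v.version)     # remember the version seen
                  return v.value
              def write(v, x):
                  writes[v] = x                      # buffer until commit
              result = txn(read, write)
              with _commit:
                  if all(v.version == ver for v, ver in reads.items()):
                      for v, x in writes.items():    # validation passed: publish
                          v.value, v.version = x, v.version + 1
                      return result                  # otherwise: retry from scratch

      acct = TVar(100)
      atomic(lambda read, write: write(acct, read(acct) - 30))
      print(acct.value)                              # 70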

Research Area: theory

  • This project studies applications of discrete mathematics in computer science. The topics include combinatorics, counting, coding theory, game theory, learning theory, and routing.

  • This project focuses on complexity theory. Among its interests are reductions, resources and models, robustness, and the power of heuristic algorithms.

  • This project studies complexity-theoretic and algorithmic aspects of political science and economics—in particular, of voting theory and game theory. Our work ranges from experimental study of Congressional apportionment to theoretical studies of voting systems and cooperative game theory. We are particularly interested in the ways in which complexity can serve as a tool to protect elections from attacks.

  • This project studies theoretical problems arising in the area of graph drawing (network diagram visualization). Examples of topics studied include variants of crossing numbers and their connections, generalizations of the concept of planarity, and algorithmic problems for curves on surfaces.

  • This project studies counting classes. The term "counting classes" has come to refer to a certain collection of classes—such as #P, SPP, probabilistic classes, parity-based classes, etc.—that are defined in terms of the number of accepting paths of nondeterministic machines.
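
    The canonical member is #SAT: where SAT asks whether a satisfying assignment exists, the corresponding #P function counts how many there are, i.e., how many accepting paths the obvious nondeterministic machine has. Brute force, for illustration only:

      from itertools import product

      def count_sat(cnf, n_vars):
          # Count satisfying assignments = accepting paths of the NP machine.
          return sum(all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in cnf)
                     for bits in product([False, True], repeat=n_vars))

      print(count_sat([[1, 2]], 2))   # (x1 or x2): 3 of the 4 assignments satisfy it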

  • This project studies the properties of the semi-feasible sets. A set is semi-feasible (a.k.a. P-selective) exactly if there is a polynomial-time algorithm that, given any two strings, chooses one of them, and does so in such a way that whenever exactly one of the two belongs to the set, the algorithm chooses that one. This can model guided search.
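
    A standard concrete example: for any fixed threshold r, the "left cut" L = { binary strings w : value(w) <= r } is P-selective even when r itself is wildly uncomputable, because the numerically smaller of two strings is always a safe choice.

      def selector(x, y):
          # P-selector for any left cut L = { w : int(w, 2) <= r }: if exactly
          # one of x, y lies below the threshold, it must be the smaller one.
          return min(x, y, key=lambda w: int(w, 2))

      print(selector('1101', '101'))   # '101' (5 <= 13), safe whatever r is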

  • This project studies complexity-theoretic one-way functions, cryptography, and pseudorandom generators. One central focus is seeking characterizations of the existence of various types of one-way functions. Also of interest is the extent to which queries can be made without leaking information, and learning more about the connection between foundational complexity-theoretic notions and whether all pseudorandom generators are insecure. This project is in the worst-case idiom, i.e., it studies so-called complexity-theoretic one-way functions.

  • Everyone knows that it makes more sense to first look up the date of the yearly Computational Complexity conference in your on-line datebook and then phone your travel agent for tickets, rather than to phone your travel agent first (without knowing the date) and consult your on-line datebook afterward. In real life, order matters. This project seeks to determine whether one's everyday-life intuition that order matters carries over to complexity theory. It also seeks to find cases where collapsing powerful classes induces collapses in their weaker cousins.

  • This project focuses on classes of sets of low information content, such as sparse sets.

  • This project focuses on exploring the power of quantum computation, and in particular on showing its superiority relative to classical deterministic and bounded-error probabilistic computing.
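
    A small textbook instance of that superiority (not necessarily this project's own focus) is Deutsch's problem: deciding whether a one-bit function f is constant requires two evaluations of f for any classical deterministic algorithm, while the quantum circuit below, simulated with explicit matrices, decides it with a single query to f.

      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

      def deutsch(f):
          state = np.kron(H @ [1, 0], H @ [0, 1])     # |+>|-> from |0>|1>
          U = np.zeros((4, 4))                        # oracle U_f|x,y> = |x, y^f(x)>
          for x in (0, 1):
              for y in (0, 1):
                  U[2 * x + (y ^ f(x)), 2 * x + y] = 1
          state = np.kron(H, np.eye(2)) @ (U @ state) # one oracle call, then H
          p0 = state[0] ** 2 + state[1] ** 2          # Pr[first qubit reads 0]
          return 'constant' if p0 > 0.5 else 'balanced'

      print(deutsch(lambda x: 0))                     # constant
      print(deutsch(lambda x: x))                     # balanced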