Jitter is an online, news-based, real-time interface for searching Twitter. The search interface allows users to search over Twitter’s 1% public sample in near real-time. Learning-to-rank methods are used to train a time-aware ranking model for the top results. Results can be improved further by expanding the original query with additional terms extracted from news posted on Twitter, which allows the system to find terms associated with the original query and retrieve more “interesting” posts.
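The news-based expansion step can be sketched as a simple pseudo-relevance feedback loop: collect news posts that match the query, count co-occurring terms, and append the most frequent ones. This is a minimal illustration, not Jitter's actual algorithm; the function name and the tokenisation are assumptions.

```python
from collections import Counter

def expand_query(query_terms, news_posts, k=5):
    """Hypothetical sketch of news-based query expansion: count terms
    co-occurring with the query in matching news posts and append the
    k most frequent new terms to the original query."""
    counts = Counter()
    for post in news_posts:
        tokens = post.lower().split()
        if any(t in tokens for t in query_terms):
            counts.update(t for t in tokens if t not in query_terms)
    return query_terms + [term for term, _ in counts.most_common(k)]

# Toy example with three "news" posts.
posts = [
    "apple unveils new iphone at keynote",
    "apple stock rises after iphone launch",
    "banana prices fall",
]
print(expand_query(["apple"], posts, k=2))
```

A production system would weight candidate terms by recency as well as frequency, in line with the time-aware ranking described above.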
NovaMedSearch is a medical search engine that integrates two search modalities: text and image. Our goal is to provide an intuitive, simplified way of supporting multimodal queries in medical search.
Users can upload their own images to build a query or use existing sample images. The results are displayed in a ranked list with basic information (e.g., title, keywords, and images, if available) and a link to the corresponding article details.
The ranking takes into account both the relevance of the images and the similarity of the text.
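One common way to combine image relevance and text similarity is a weighted late fusion of the two per-document scores. The sketch below is an assumption about how such a combination could look, not NovaMedSearch's actual method; the function name, the weight `alpha`, and the fallback for image-less articles are all illustrative.

```python
def fuse_scores(text_scores, image_scores, alpha=0.6):
    """Hypothetical linear late fusion: combine per-document text and
    image similarity scores; documents without an image score fall
    back to their text score alone."""
    fused = {}
    for doc, ts in text_scores.items():
        im = image_scores.get(doc)
        fused[doc] = alpha * ts + (1 - alpha) * im if im is not None else ts
    # Return documents ranked by fused score, best first.
    return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)

text = {"a1": 0.9, "a2": 0.4, "a3": 0.7}
image = {"a2": 0.95, "a3": 0.2}
print(fuse_scores(text, image))
```

Here `alpha` controls the balance between the two modalities; a system would typically tune it on held-out relevance judgements.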
This is a game played with facial expressions, without any remote control or gestures: the best actor wins the game! The game dynamics were tuned to increase competitiveness among players and to demonstrate a new affective computing paradigm.
Several machine learning algorithms support the game, which integrates an API for image indexing and retrieval. The game demonstrates real-time image analysis and feature extraction for search and other applications. It uploads data to a REST service that indexes and searches images; the Searcmotions application illustrates the potential of the developed search methods.
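The upload step can be pictured as extracting a compact feature vector from each frame and wrapping it in a JSON payload for the REST indexing endpoint. The sketch below uses a deliberately coarse colour histogram as a stand-in feature; the payload shape, field names, and endpoint are assumptions, not the system's actual API.

```python
import json

def image_payload(image_id, pixels):
    """Hypothetical sketch: turn an image (a list of (r, g, b) tuples)
    into a coarse 8-bin colour-histogram feature vector and wrap it in
    a JSON payload that could be POSTed to a REST indexing service."""
    hist = [0] * 8  # 2 bins per channel -> 2**3 coarse colour bins
    for r, g, b in pixels:
        idx = (r // 128) * 4 + (g // 128) * 2 + (b // 128)
        hist[idx] += 1
    total = len(pixels) or 1
    features = [round(h / total, 4) for h in hist]
    return json.dumps({"id": image_id, "features": features})

# A two-pixel toy image: one dark pixel, one bright red pixel.
payload = image_payload("img-001", [(10, 10, 10), (200, 30, 30)])
print(payload)
```

In a real deployment the features would come from the game's image-analysis pipeline, and the payload would be sent with an HTTP POST to the indexing endpoint.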