Concept: User interface
User-centred design (UCD) is an approach to user interface design in which the needs and desires of users are taken into account at each stage of the design process for a service or product, often a software application or website. Its goal is to facilitate the design of software that is both useful and easy to use. To achieve this, you must characterise users' requirements, design suitable interactions to meet their needs, and test your designs using prototypes and real-life scenarios. For bioinformatics, there is little practical information available on how to carry out UCD. To address this, we describe a complete, multi-stage UCD process used to create a new bioinformatics resource for integrating enzyme information, called the Enzyme Portal (http://www.ebi.ac.uk/enzymeportal). This freely available service mines and displays data about proteins with enzymatic activity from public repositories via a single search, and includes biochemical reactions, biological pathways, small-molecule chemistry, disease information, 3D protein structures and relevant scientific literature. We employed several UCD techniques, including persona development, interviews, ‘canvas sort’ card sorting, user workflows and usability testing, among others. Our hope is that this case study will motivate the reader to apply similar UCD approaches to their own software design for bioinformatics. Indeed, we found that the benefits included more effective decision-making for design ideas and technologies; enhanced team-working and communication; cost effectiveness; and, ultimately, a service that more closely meets the needs of our target audience.
This paper tackles the design of a graphical user interface (GUI) based on Matlab (MathWorks Inc., MA), a worldwide standard for biosignal processing, which allows the simultaneous acquisition of muscular force signals and images from an ultrasound scanner. It thus becomes possible to unify two key quantities for analyzing the evolution of muscular injuries: the force exerted by the muscle and the section/length of the muscle while that force is exerted. The paper describes the modules developed and finally demonstrates their applicability with a case study analyzing the functional capacity of the shoulder rotator cuff.
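The core of such a tool is acquiring the two streams at once and keeping them alignable in time. The following is a minimal sketch of that idea, not the paper's Matlab code: each stream runs in its own thread and every sample is timestamped so force and ultrasound data can be matched afterwards. `read_force()` and `grab_frame()` are hypothetical stand-ins for the real device drivers.

```python
import threading, time, queue

def read_force():          # hypothetical stand-in for the force-sensor driver
    return 42.0

def grab_frame():          # hypothetical stand-in for the ultrasound driver
    return b"frame-bytes"

def acquire(read_fn, out, stop, period):
    """Poll a device at a fixed period, timestamping each sample."""
    while not stop.is_set():
        out.put((time.monotonic(), read_fn()))
        time.sleep(period)

force_q, image_q, stop = queue.Queue(), queue.Queue(), threading.Event()
threads = [
    threading.Thread(target=acquire, args=(read_force, force_q, stop, 0.001)),
    threading.Thread(target=acquire, args=(grab_frame, image_q, stop, 0.04)),
]
for t in threads:
    t.start()
time.sleep(0.2)            # acquire for ~200 ms
stop.set()
for t in threads:
    t.join()
print(force_q.qsize(), image_q.qsize())   # force is sampled far more often
```

The shared timestamps are what make it possible to relate muscle section/length in a given frame to the force exerted at that instant.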
The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.
The paper presents a multifunctional joint sensor with measurement adaptability for biological engineering applications, such as gait analysis and gesture recognition. The adaptability covers both static and dynamic measurement environments, both body pose and motion capture. Its multifunctional capability lies in the simultaneous measurement of multiple degrees of freedom (MDOF) with a single sensor, reducing system complexity. The basic working mode enables 2DOF spatial angle measurement over large ranges and stands out because it can be applied to different joints of different individuals without recalibration. The optional advanced working mode enables measurement of an additional DOF for various applications. By employing a corrugated tube as the main body, the sensor is also flexible and wearable, imposing fewer restraints. MDOF variations are converted into linear displacements of the sensing elements. The simple reconstruction algorithm and small output volume are capable of providing real-time angles and long-term monitoring. The performance assessment of the built prototype is promising enough to indicate the feasibility of the sensor.
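To illustrate the kind of reconstruction the abstract refers to (not the authors' actual algorithm), the sketch below assumes two sensing elements placed 90° apart at radius `r` on the tube wall: a bend of magnitude θ in direction φ then shortens or lengthens each element by Δl = rθ·cos/sin(φ), which is trivially invertible.

```python
import math

def reconstruct_angles(dl1, dl2, r):
    """Recover a 2DOF bend from two linear displacements.
    dl1, dl2: length changes [mm] of the two sensing elements;
    r: radial offset [mm] of the elements from the tube axis.
    Returns (theta, phi) in radians."""
    theta = math.hypot(dl1, dl2) / r   # total bend angle
    phi = math.atan2(dl2, dl1)         # direction of the bend plane
    return theta, phi

# Forward model for a known bend, then invert it:
r, theta_true, phi_true = 5.0, 0.6, 0.8
dl1 = r * theta_true * math.cos(phi_true)
dl2 = r * theta_true * math.sin(phi_true)
theta, phi = reconstruct_angles(dl1, dl2, r)
print(round(theta, 3), round(phi, 3))  # recovers 0.6, 0.8
```

Because the inversion is two arithmetic operations per sample, it is cheap enough for the real-time, long-term monitoring the sensor targets.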
Presumptive identification of different Enterobacteriaceae species is routinely achieved based on biochemical properties. Traditional practice involves manually comparing each biochemical property of the unknown sample with those of known reference samples and inferring its identity from the maximum similarity pattern with the known samples. This process is labor-intensive, time-consuming, error-prone, and subjective. Therefore, automation of sorting and similarity calculation would be advantageous. Here we present a MATLAB-based graphical user interface (GUI) tool named BioCluster. This tool was designed for automated clustering and identification of Enterobacteriaceae based on biochemical test results. In this tool, we used two types of algorithms: traditional hierarchical clustering (HC) and Improved Hierarchical Clustering (IHC), a modified algorithm developed specifically for the clustering and identification of Enterobacteriaceae species. IHC takes into account the variability in the results of 1-47 biochemical tests within the Enterobacteriaceae family. The tool also provides different options to optimize the clustering in a user-friendly way. Using computer-generated synthetic data and some real data, we have demonstrated that BioCluster has high accuracy in clustering and identifying enterobacterial species based on biochemical test data. The tool can be freely downloaded at http://microbialgen.du.ac.bd/biocluster/.
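The traditional HC baseline the abstract mentions can be sketched in a few lines. This is an illustrative toy, not BioCluster's code: isolates are binary vectors of test results (1 = positive, 0 = negative), distance is the fraction of disagreeing tests, and clusters are merged by average linkage until no pair is closer than a threshold.

```python
from itertools import combinations

def hamming(a, b):
    """Fraction of biochemical tests on which two profiles disagree."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def hierarchical_cluster(profiles, threshold=0.3):
    """Agglomerative clustering with average linkage, stopping once the
    closest pair of clusters is farther apart than `threshold`."""
    clusters = [[name] for name in profiles]

    def dist(c1, c2):
        pairs = [(profiles[i], profiles[j]) for i in c1 for j in c2]
        return sum(hamming(a, b) for a, b in pairs) / len(pairs)

    while len(clusters) > 1:
        (i, j), d = min(
            (((i, j), dist(clusters[i], clusters[j]))
             for i, j in combinations(range(len(clusters)), 2)),
            key=lambda t: t[1])
        if d > threshold:
            break
        clusters[i] += clusters.pop(j)   # merge the closest pair
    return clusters

# Toy profiles over five biochemical tests (labels are illustrative only):
profiles = {
    "isolate A": (1, 1, 0, 0, 1),
    "isolate B": (1, 1, 0, 0, 0),
    "isolate C": (0, 0, 1, 1, 1),
}
print(hierarchical_cluster(profiles))  # A and B group together
```

IHC, per the abstract, additionally models the known within-family variability of each test, which this plain-distance version ignores.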
A computational toolkit (spektr 3.0) has been developed to calculate x-ray spectra based on the tungsten anode spectral model using interpolating cubic splines (TASMICS), updating previous work based on the tungsten anode spectral model using interpolating polynomials (TASMIP). The toolkit includes a Matlab (The MathWorks, Natick, MA) function library and an improved user interface (UI), along with an optimization algorithm to match calculated beam quality with measurements.
Citizen science enables volunteers to contribute to scientific projects, where massive data collection and analysis are often required. Volunteers participate in citizen science activities online from their homes or in the field and are motivated by both intrinsic and extrinsic factors. Here, we investigated the possibility of integrating citizen science tasks within physical exercises envisaged as part of a potential rehabilitation therapy session. The citizen science activity entailed environmental mapping of a polluted body of water using a miniature instrumented boat, which was remotely controlled by the participants through their physical gesture tracked by a low-cost markerless motion capture system. Our findings demonstrate that the natural user interface offers an engaging and effective means for performing environmental monitoring tasks. At the same time, the citizen science activity increases the commitment of the participants, leading to a better motion performance, quantified through an array of objective indices. The study constitutes a first and necessary step toward rehabilitative treatments of the upper limb through citizen science and low-cost markerless optical systems.
The increasing interest in developing nanodevices for biophysical and biomedical applications results in concerns about thermal management at interfaces between tissues and electronic devices. However, there is neither sufficient knowledge nor suitable tools for the characterization of thermal properties at interfaces between materials of contrasting mechanics, which are essential for design with reliability. Here we use computational simulations to quantify thermal transfer across the cell membrane-graphene interface. We find that the intercalated water displays a layered order below a critical value of ∼1 nm nanoconfinement, mediating the interfacial thermal coupling, and efficiently enhancing the thermal dissipation. We thereafter develop an analytical model to evaluate the critical value for power generation in graphene before significant heat is accumulated to disturb living tissues. These findings may provide a basis for the rational design of wearable and implantable nanodevices in biosensing and thermotherapic treatments where thermal dissipation and transport processes are crucial.
A big challenge in current systems biology research arises when different types of data must be accessed from separate sources and visualized using separate tools. The high cognitive load required to navigate such a workflow is detrimental to hypothesis generation. Accordingly, there is a need for a robust research platform that incorporates all data, and provides integrated search, analysis, and visualization features through a single portal. Here, we present ePlant (http://bar.utoronto.ca/eplant), a visual analytic tool for exploring multiple levels of Arabidopsis data through a zoomable user interface. ePlant connects to several publicly available web services to download genome, proteome, interactome, transcriptome, and 3D molecular structure data for one or more genes or gene products of interest. Data are displayed with a set of visualization tools that are presented using a conceptual hierarchy from big to small, and many of the tools combine information from more than one data type. We describe the development of ePlant in this paper and present several examples illustrating its integrative features for hypothesis generation. We also describe the process of deploying ePlant as an “app” on Araport. Building on readily available web services, the ePlant code is freely available and can be adapted for research on other biological species.
Hand-gesture-based sterile interface for the operating room using contextual cues for the navigation of radiological images
- Journal of the American Medical Informatics Association (JAMIA)
This paper presents a method to improve the navigation and manipulation of radiological images through a sterile hand gesture recognition interface based on attentional contextual cues. Computer vision algorithms were developed to extract intention and attention cues from the surgeon’s behavior and combine them with sensory data from a commodity depth camera. The developed interface was tested in a usability experiment to assess its effectiveness. An image navigation and manipulation task was performed, and the gesture recognition accuracy, false positives and task completion times were computed to evaluate system performance. Experimental results show that gesture interaction and surgeon behavior analysis can be used to accurately navigate, manipulate and access MRI images, and therefore this modality could replace the use of keyboard- and mouse-based interfaces.
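The evaluation metrics named above are straightforward to compute from a trial log. The sketch below uses made-up data, not the study's results: each trial pairs the intended gesture (or `None` when no gesture was performed) with what the recognizer reported.

```python
def evaluate(trials):
    """trials: list of (intended, recognized) pairs; intended is None
    when the surgeon performed no gesture.
    Returns (recognition accuracy, number of false positives)."""
    performed = [(g, r) for g, r in trials if g is not None]
    idle = [(g, r) for g, r in trials if g is None]
    accuracy = sum(g == r for g, r in performed) / len(performed)
    false_positives = sum(r is not None for _, r in idle)
    return accuracy, false_positives

# Hypothetical log: three gestures (two recognized correctly) and two
# idle intervals (one spurious detection).
trials = [
    ("rotate", "rotate"), ("zoom", "zoom"), ("pan", "zoom"),
    (None, None), (None, "pan"),
]
acc, fp = evaluate(trials)
print(acc, fp)  # accuracy 2/3, one false positive
```

Task completion time, the third metric, would simply be the difference between the first and last timestamps of each task's log entries.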