General Poster Session
Computation based on genomic data is becoming increasingly popular. Genomic data is used in the medical domain and other applications. Genomic data is highly sensitive; it discloses information about the data owner as well as the owner’s relatives. Therefore, protecting genomic data is essential. Toward this goal, we developed general secure computation techniques for malicious and semi-honest users based on garbled circuit evaluation; they are applicable to ancestry, paternity, and genomic-compatibility testing.
Streaming data anonymity schemes based on k-anonymity offer fast privacy preservation but rely on buffering schemes that incur high information loss on intermittent data streams. We propose a scheme that adjusts the buffer size based on data arrival rates and uses k-anonymity to enforce data privacy. Our scheme incurs an average information loss of 1.95%, compared to other solutions that average 12.7%.
Many people treat Facebook as the social medium where they share even trifling matters of their lives. While Facebook enables socialization, networking, and relationship maintenance, it also exposes people to harassment, directly or indirectly. In this research, we investigate the causes of harassment faced by Facebook users from the perspective of Bangladesh and propose solutions to mitigate those causes.
This project developed a sensor-based approach to obtain quantitative measurements of gait patterns for people with and without glaucoma. The acceleration signals were collected during the clinical gait experiments, and the statistical gait features were extracted to train various learning algorithms to distinguish glaucoma from normal gait patterns. The results demonstrated that our wearable sensing system can recognize glaucoma patients with more than 90% accuracy.
Smart vehicles are considered major providers of ubiquitous information services. We propose the caching-assisted data delivery (CADD) scheme as a solution for expedited, cost-effective access to vehicular sensing resources. CADD relies on the deployment of "light-weight" road caching spots (RCSs) at intersections and the use of vehicles as data carriers. Performance evaluation of CADD shows significant improvements in access cost and delay compared to a scheme that does not deploy RCSs.
We propose a robust framework for detecting siblings from a pair of images, based upon how closely one image’s feature set matches that of another. In calculating similarity for a given pair, our algorithm predicts a sibling pair only when matched-feature vectors are above a defined similarity threshold. We also develop a novel classification strategy that fuses a genetic algorithm and a support vector machine to identify siblings.
In this work we introduce Insistent Spectrum Sensing Data Falsification as a new attack model for distributed cooperative spectrum sensing schemes that are based on iterative average consensus. We compare various schemes in terms of detection performance under this attack. Moreover, we propose a trust management scheme to mitigate the attack. Finally, we quantify the performance improvement due to trust management through extensive simulations.
A program sought to optimize its recruitment process, which currently consists of an application and two interviews. Using a subset of admission data and suggested external data, we built two support vector machines. Our first SVM predicted which applicants should skip the phone interview; our second predicted the application outcome. The overall interview rate was reduced from 57.5% to 35.5%, while maintaining a prediction accuracy above 80%.
This work presents a method to quantify differences in data using a newly defined distance metric between graphs, which is used to study, analyze, and validate a statistical predictive model. The proposed method constructs smoothed Reeb graphs of a given graph, which are then used to compare different outcomes of the statistical model. We hope to use this method to study and analyze the spread of AIDS in a community.
We have developed a personalized computer game for children with ASD, where they can create their own Customizable Virtual Teacher by integrating various facial features and wearable accessories. The Virtual Teacher Character has the potential to be deployed in a virtual classroom. The findings of the proposed research will enrich the research area of technology-enhanced special education by developing a new approach to virtual classrooms for children with ASD.
Flow statistics are used to discover anomalies by aggregating network traces and then using machine-learning classifiers to discover suspicious activities. However, the efficiency and effectiveness of flow classification models depend on the granularity of aggregation. This paper describes a novel approach that aggregates packets into network flows and correlates them with security events generated by payload-based IDSs for detection of cyber-attacks.
In recent years, data collection has far outpaced the tools for making sense of large amounts of unstructured data, such as text. Additionally, presenting the data in a way that is meaningful to the end user poses another challenge. To address these problems, we developed an interactive geovisualization tool: Global Pattern Search at Scale (GPSS). In this poster, we present GPSS and show how it can reveal patterns to analysts.
Vehicular Ad-Hoc Networks (VANETs) have emerged as a novel technology for revolutionizing the human driving experience. Messages received in VANETs may contain malicious content that can affect the network. Hence, security is a major consideration before the deployment of such networks. We propose a privacy-preserving authentication framework that employs Bloom filters to reduce Certificateless Aggregate Signature (CLAS) verification time in vehicular communications.
We present a novel way to visualize and cluster massive multivariate time series based on temporal association rules. We obtain a bitmap icon representation of symbolic time interval series, which reveals the pairwise temporal association relations between dimensions of a data object based on Allen’s relations. The relations can be effectively visualized in a constant size small icon, independent of the length of the series.
Cheating, a controversial behavior, has been prevalent in online gaming social networks since the birth of online games. In this work, we empirically study the adoption of cheating behavior and show that it is a contagion: the probability of adoption depends on the number of cheater friends in a user’s neighborhood. Moreover, repeated exposure to cheater neighbors increases the contagion probability.
The enormous increase in multimedia content creates a need for effective retrieval systems, in which key frame extraction plays a vital role. Many existing key frame extraction methods are computationally expensive or fail to capture major visual content. We propose a similarity-based approach to key frame extraction that is computationally efficient and effective: we compute the similarity between two frames using correlation, which determines the potential key frames.
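As an illustration only (not the authors' implementation), a correlation-based key frame selector could look like the following sketch; the function name, the frame representation as NumPy arrays, and the threshold value are all assumptions:

```python
import numpy as np

def extract_key_frames(frames, threshold=0.9):
    """Illustrative sketch: select a frame as a new key frame when its
    Pearson correlation with the most recent key frame drops below
    `threshold` (i.e., the scene has changed enough)."""
    if not frames:
        return []
    key_frames = [0]  # the first frame is always a key frame
    for i in range(1, len(frames)):
        a = frames[key_frames[-1]].ravel().astype(float)
        b = frames[i].ravel().astype(float)
        corr = np.corrcoef(a, b)[0, 1]  # correlation of pixel intensities
        if corr < threshold:
            key_frames.append(i)
    return key_frames
```

The threshold controls the trade-off between coverage and compactness: a higher threshold yields more key frames.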
After studying Unmanned Aerial Vehicles and Systems (UAVs and UASs, respectively) technology for several years and seeking the input of industry experts on best practices, safety protocols, and procedures, USAA is researching how to effectively leverage and develop the technology for catastrophe situations. USAA wants to begin testing small unmanned aircraft systems that can record data over areas damaged as the result of a natural disaster.
Young Women in Computing (YWiC) is an outreach initiative at New Mexico State University (NMSU) designed to spark female student interest in computer science. By enhancing interest in computing among key student populations, YWiC has successfully created a pipeline into C-STEM undergraduate programs. This poster will present insights, challenges, and quality practices that YWiC has developed to become a successful and sustainable outreach program in Southern New Mexico.
In this poster, we test which better increases student interest in Computer Science: going into a classroom setting or bringing students to a university. We conducted a 2-day web development activity at our local high school and an afternoon at our university where students began to learn to code. Our results show that an invited event yields higher student interest in studying CS further.
The power of an engaging science or technology demonstration is that it can serve as an on-ramp to many Science, Technology, Engineering, and Mathematics (STEM) fields. Such demos should be interactive, have an affordable take-home component, and leave an opening for student-led development. In this presentation we will show an example of one such demo, the Human Supercomputer, which allows students to explore parallel computing by becoming the compute nodes.
For moderate-spatial-resolution sensors like Landsat, researchers wait days to obtain an image for a study area, and it may be useless because of cloud cover. For coarse-spatial-resolution sensors like MODIS, the temporal frequency is much better. A wavelet-based spatiotemporal fusion model is proposed to estimate missing Landsat data from the Landsat and MODIS data available at other dates and the MODIS data available at the prediction date.
Millimeter-wave communication achieves multi-Gb/sec data rates via highly directional links that overcome path loss. Establishing directional communication is a high-overhead procedure whose search space scales with the product of the sender and receiver beam resolutions. We design Blind Beam Steering (BBS), a novel architecture and algorithm that removes in-band overhead for directional mm-Wave link establishment. We use 2.4/5 GHz omni-directional transmissions with angle-of-arrival analysis to obtain the line-of-sight path for directional mm-Wave communication.
We develop an anomaly detection and decision support system based on data collected through the Street Bump smartphone application. The system effectively classifies roadway obstacles into predefined categories using machine learning algorithms, as well as identifies a ranked list of actionable ones requiring immediate attention based on a proposed anomaly index. Results on an actual dataset provided by the City of Boston illustrate the effectiveness of our system in practice.
In this paper, we aim to model the knowledge representation and learning mechanisms used by infants and to apply those mechanisms, drawn from developmental psychology studies of infants, to the development of reaching skills in robots. We hope that by mimicking the known learning process in infants, autonomous robots will be more intelligent, adaptable, and useful than traditional robots.
This study breaks down the privacy policies of two well-known social networks and financial institutions into simple segments. Crowdsourcing was used to analyze consumers’ responses to these policy segments and to gauge users’ awareness, expectations, familiarity, and privacy concerns regarding them. The relationships between factors such as demographics, data type, and data flow and consumers’ privacy concerns were also investigated. Finally, suggestions for improving privacy policies and increasing users’ awareness of them are presented.
Real-world machine learning problems are numerous, and selecting the right machine learning algorithm is critical for state-of-the-art performance. Developing an automated command-line machine learning pipeline has proven advantageous for the training and parameter weight selection of each learning algorithm. Generated graphs and plots help to adjust parameter weights manually when domain knowledge is present. Automation saves time and improves accuracy.
We take a novel approach to zero pronoun resolution in Chinese: our model explicitly tracks the flow of focus in a discourse. Our approach is not reliant on the presence of overt noun phrase antecedents to resolve to, and allows us to address the large percentage of "non-anaphoric" pronouns filtered out in other approaches. We train our model using readily available parallel Chinese/English corpora, allowing for training without hand-annotated data.
Image segmentation, one of the significant problems in computer vision and pattern recognition, is the process of separating a specific digital object from its background. This process yields a better representation of the multiple segments and supports analysis of segments that share similar characteristics. Our concern is the optimal automation of separating a specific object from its background in a digital setting using the knowledge of digital geometry.
Our work proposes an end-to-end neural network model for video description that takes in video frame pixels as input and outputs sentences. In an extensive experimental evaluation, we show that our approach, which uses Long Short-Term Memory (LSTM) RNNs, generates better sentences than related approaches. We also show that exploiting image description datasets improves performance compared to relying only on video description data.
Although the perception is that role models and peer mentoring improve women’s retention in computing, little work has been done on the success of models of support for women learning to code. This study addresses the question "what influence does participating in a small, virtual group of female peers and a more-knowledgeable coach have on the persistence and confidence of female end user programmers who are teaching themselves to code?"
We propose an intelligent intrusion detection system that can counteract DoS and DDoS attacks in a cloud environment, since cloud services are vulnerable to DoS attacks. The proposed model uses a hybrid intrusion detection system that leverages host-based and network-based IDSs and uses a cooperative module to communicate DoS attacks.
The goal of this research work is to provide users with personalized suggestions and help them decide which items are best tailored towards their individual interests. This is accomplished by exploiting users' personal reviews to identify items' features appealing to a user. Each generated recommendation is paired with an explanation that showcases why a particular item was selected, based on reviewers’ opinions on features of interest to a user.
Latency in a Virtual Environment is known to disrupt user experience. However, can we utilize the effects caused by experiencing latency to benefit virtual rehabilitation technologies? We investigate this question by conducting an experiment aimed at altering gait by introducing latency applied to one side of a self-avatar with a front-facing mirror. The results indicate the potential of using one-sided latency to develop asymmetric gait rehabilitation techniques.
Mobile reading is becoming popular in developing countries with the introduction of affordable mobile phones. Co-reading of eTextbooks has received surprisingly little attention, yet it is common with paper books. A study was conducted with students from the University of Cape Town to identify an initial set of user requirements for designing interactive eTextbooks. The study reveals insights into user needs and the potential benefits of digital media for co-reading.
By considering how browsing content relates to psychological constructs, and how an individual's browsing behaviour deviates over time, potential insider threats could be uncovered before significant damage is caused. We present an initial approach to map website keywords to OCEAN personality traits; use the keyword-personality mapping to create a browsing profile that can be monitored for browsing deviations; and use these deviations to uncover changes in personality traits.
There is an increasing trend of companies interacting with their users through various social media platforms. Online social media platforms provide a feedback mechanism for content providers to learn how content is shared across their follower groups. In this study, we present a framework for analyzing user engagement with content using a topic model approach. We also apply the same approach to find potential followers.
Teenage pregnancy can be reduced by educating girls about sex and the outcomes of early pregnancy, at school or at home, before the age of 14, when young people become sexually active. Computer-based awareness is a good tool and avenue for this cause. An informational tablet-like toolkit can provide pregnancy-related information and basic computer skills, equipping teenage girls with sex education and broadening their participation in Computer Science.
Modern scientific simulations divide work between parallel processors by decomposing a spatial domain of elements; a balanced assignment is critical for parallel performance. Typical SPMD simulations wait while a load balancing algorithm determines how to evenly redistribute work across processes. We make load balancing affordable by decoupling it from the application and running the load balancing algorithm concurrently with the application on a smaller number of processors.
A method-level offloading mechanism is proposed in which no prior image of the mobile device needs to be transferred to the cloud. The application is partitioned at different points where execution-thread migration to a nearby resource-rich cloud yields the best execution performance at optimal energy cost. This mechanism increases scalability and performance in the form of faster execution on mobile devices.
Lexos is an open-source Python/Flask-based text analysis workflow, mainly used by researchers to analyze digitized English texts. As a native Mandarin speaker, I am interested in analyzing non-Western languages. To conduct testing on eight modern Chinese texts, tokenization, hierarchical clustering analysis, and silhouette scores were implemented in the Lexos GUI. The test results demonstrated the relationships among the input files.
With the drive to exascale, the demand for power in the High Performance Computing (HPC) world is increasing. This work describes the process of developing a methodology for reducing power consumption in a data center, both to improve the efficiency of the current data center and to provide more power for future computing. The development of this methodology enabled similar applications at additional LANL data centers.
Designed protein crystals have great potential as structure determination tools and as biomaterials for energy, environmental, and health-related applications. The general rules for protein crystallization can be broken down into chemical, physical and geometric constraints. Incorporating these parameters into tools for the accurate structural modeling of macromolecules, we propose to develop an object-oriented computational method capable of yielding crystal structures constructed from complex protein scaffolds with atomic-level accuracy.
Recent efforts indicate that neurological-like somatic cell memory may play a foundational role in 3-dimensional pattern formation in development and regeneration, yet there are no comprehensive network models exploring complex learning and memory in the context of network remodeling. We evaluate the robustness of artificial neural network memory under several perturbation methods as a first step toward understanding information storage in cellular networks during embryogenesis and large-scale regeneration.
Click-Through Rate (CTR) is traditionally used to measure the success of advertising campaigns. Quixey used it to measure overall search user engagement. We experienced the limitations of traditional CTR and implemented a normalized variation, enhanced for use in user behavior datasets where searches and clicks are not correlated. This proved to be a better indicator of user engagement in search and was acknowledged as superior to Alibaba’s metric.
In today’s world, finding a perfect match between job seekers and job posters is difficult. There is an emerging need among candidates and companies for an automated system that can resolve this; "Personalized Job Matching" is a possible solution. We have developed a model that receives resumes from candidates and requirements from companies, builds an index using advanced natural language processing, and retrieves related jobs or potential candidates.
Current alternatives to human-instructed music theory courses provide limited feedback to students, lack a writing modality, or assume that the student already has a proficient grasp on music theory. To circumvent this dilemma, we present Maestoso: a smart, pen-based, sketching tool for learning music theory. Maestoso automatically recognizes user input and generates instructor-emulated feedback. Results show that novice students can comfortably grasp introductory music theory in a single Maestoso session.
Against a backdrop of increasing undergraduate computer science enrollments, and despite the proliferation of programs to broaden participation in CS, the percentage of women awarded CS bachelor degrees has essentially stagnated since 2007. Why do we not see significantly more women receiving CS bachelors? We propose a model of enrollment pressures forcing out female students, as increasing enrollments compound the effects of existing contributors to low female computer science participation.
We propose a formal approach for linking continuous-time models and discrete-time models. A direct semantics for a network of timed automata with a discrete-time component is introduced. We provide a translation of the discrete-time controller into timed automata, proving its correctness. The usefulness of the approach is shown by proving that a discrete-time medical infusion pump is safe with respect to a continuous-time clinical scenario.
As part of a National Science Foundation grant aimed at female involvement and achievement in STEM, we developed an educational computer science game entitled “Ice Maze” to help teach introductory computer science concepts and increase interest in the field. The goal of the research was to examine how growth mindsets can help buffer against effects of identity threat on sense of belonging and performance in the field of computer science.
Development of a novel molecule that is a candidate to treat a condition can take years of research and cost billions of dollars. Drug re-positioning is the practice of taking a drug used to treat an existing condition and re-purposing it to see if it is safe and effective in treating another condition. This project uses a computational biology approach of leveraging disease-disease similarity to propose potential drug candidates for re-positioning.
We present a novel visual analytics platform for interactive exploration of time-varying medical data. Our framework is targeted at visualizing the spleen and observing changes in its shape and size over time. The system is comprised of multiple linked views, and allows for fast comparison between several scans of a single patient as well as similarity queries for other subjects in both sick and healthy groups.
Due to the increasing number of apps in app repositories, there are often multiple apps with similar features that may incur different energy costs, depending on design choices. Given apps with similar features, users prefer the app with the lowest energy cost. However, app repositories lack information about the energy costs of apps. We have developed an approach that ranks apps in the same category by their energy consumption.
A brine-CH4 model was added to PFLOTRAN, a massively parallel subsurface reservoir modeling code, to enable flow simulation in shale formations. The code's accuracy and parallel performance were compared to TOUGH2, the state-of-the-art reservoir modeling code currently used by DOE. PFLOTRAN was found to be as accurate and an order of magnitude faster, which opens new possibilities in risk assessment for hydraulic fracturing and oil applications.
This research looks at the correlation between the mindset of a student and their completion of a Massive Open Online Course (MOOC). More specifically, it examines mindset using Carol Dweck's fixed-intelligence and growth mindsets, predicting that those who display traits of a fixed-intelligence mindset in their forum posts are more likely to have failed to complete the course than those who did not.
A trip planning query (TPQ) returns a set of data points, for a given source-destination pair and types of facilities (e.g., restaurant, market), that minimizes the total travel distance of the trip. Due to privacy concerns, users may not wish to disclose their exact source and destination locations. In this paper, we develop the first privacy-preserving approach to evaluating TPQs without disclosing a user's actual source-destination locations to the location-based service provider.
Verifying a software system for production readiness usually requires a meticulous process of ensuring all requirements have been completely integrated. The Verification Cross-Reference Matrix provides a comprehensive method to verify requirements are complete and concise during design, trace the requirements through development, and ensure all requirements have been accounted for prior to going into production, effectively ensuring that no requirement has been left behind.
Low-power wearable devices are prone to safety and sustainability issues. Thus, I propose a novel non-linear optimization framework to find the optimal safe and sustainable design for such devices under defined time, hardware, and software constraints. It simulates the continuous dynamics of human physiology to consider the interaction effects between the human body and wearable sensors.
The Rosetta Membrane method (RMM) is developed to predict and design the three-dimensional structures of proteins from their primary sequence using de novo and homology modeling approaches. It predicts amino acid packing in membrane proteins using a statistically derived energy function based on known transmembrane protein (TMP) structures. I aim to improve the accuracy of TMP prediction by refining the environment, pair and density terms in the RMM energy function.
Humans are sometimes capable of detecting spelling errors in words they have never encountered; in effect, they perform spellchecking without a known list of spellings. Once this human behavior is modeled as input features, machine learning algorithms can be used to teach a computer to spellcheck in the absence of a wordlist.
This poster describes a mathematical model for a brain-inspired computational framework called Hierarchical Temporal Memory (HTM). This model considers how HTM, like our brains, can both memorize information and make generalizations. By examining the numerical parameters required to ensure similar inputs have similar representations and different inputs have different representations, we can begin to understand how HTM learns to recognize inputs.
Role-Based Access Control (RBAC) is widely used for fine-grained access control. Due to its complexity, including role management, role hierarchies with hundreds of roles, and their associated privileges and users, systematically testing RBAC systems is crucial to ensure security. We introduce an RBAC security testing technique that uses a Multi-Terminal Binary Decision Diagram (MTBDD) based representation of the RBAC security policy, RHMTBDD (Role Hierarchy MTBDD), to efficiently generate effective security test cases.
An increasing interest in educational video games demands research into game design that enhances cognitive abilities such as memory performance. In this paper, we tested how color variation in game design can influence players’ emotions and whether these elements impact players’ performance. Our study focused on the effect of using either cool or warm colors.
Our technique preserves the privacy of the dataset by introducing a random component into the normalization process. Because the generated random numbers have a limited distribution and a mean close to one, the distribution of the perturbed data points does not destroy data mining accuracy and utility. The technique allows data providers to choose the appropriate trade-off between accuracy and privacy by selecting the width of the interval.
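As a minimal sketch of the idea above (not the authors' actual method), one could multiply min-max-normalized values by random factors drawn from a narrow interval around one; the function name, the choice of min-max normalization, and the uniform distribution are all assumptions:

```python
import numpy as np

def perturbed_normalize(data, width=0.1, seed=None):
    """Illustrative sketch: min-max normalize `data`, then multiply each
    value by a random factor drawn uniformly from
    [1 - width/2, 1 + width/2]. The factors have mean one, so aggregate
    statistics are roughly preserved; a narrower `width` favors accuracy,
    a wider one favors privacy."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    normalized = (data - data.min()) / (data.max() - data.min())
    factors = rng.uniform(1 - width / 2, 1 + width / 2, size=data.shape)
    return normalized * factors
```

Because each factor is close to one, the perturbed values stay near their normalized counterparts while individual values are no longer exactly recoverable.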
Informal science institutions (ISIs) are beginning to adopt mobile technology to support docents. In our context end-users are often youth docents who are emerging professionals that may not yet fully understand the task domain. We developed and applied two different framing strategies (one technological, one sociotechnological) to traditional participatory design methods to help non-expert youth docents generate task-relevant design ideas. We report results from using these strategies.
In this study we explore conversational structures within Twitter (using data mining and statistical techniques) by comparing the topic of violence/violence against women with other women-related issues. We discover far higher response depth, degree, and activity on this topic. However, the amount of broadcast/retweets is far less. These findings not only reflect previous non-online sociology studies but also provide new perspectives on women-related issues via the social media platform.
Assurance cases are structured logical arguments that can be used to show that a software system is safe. The confidence and uncertainty one has in the assurance case are important issues that must be considered. Previous techniques involved the use of a truncated normal distribution to quantify confidence. This work proposes the use of a beta distribution, based on how it ties in with the opinion triangle proposed by Josang.
The poster focuses on applying the Eigenfaces algorithm for face recognition, together with a multimodal biometric system, to design an automated home security system with minimal chances of fraud.
The system captures an image of a person through a camera, extracts its features, and compares them with images in the database and a spoofed-image set to distinguish between real and spoofed images. The result should be highly accurate because of the difficulty of spoofing beyond the spoofed-image set.
Evacuating occupants is crucial in emergencies in confined places. However, due to the nature of such situations, emergency management is challenging; challenges include time management, resource prioritization, and the dynamism of the environment. In this research, we simulate a multi-story building with an ongoing hazard. The simulation helps us identify the risks and bottlenecks of the building while providing the opportunity to test various emergency management techniques.
Currently, the paths of plush toys and social media have rarely converged. The connection between plush toys and an online presence is an area that has yet to be developed, and even less focus has been placed on the emotional interaction between user and toy. As a result, we are exploring this void with Roboplush: an amorphous, cuddly robot that both responds to physical interaction and has its own Twitter personality.
This study compares and evaluates the effectiveness of continuous integration tools for application performance monitoring on high performance computing systems. In addition, a prototype system for application performance monitoring based on Jenkins is presented. The monitoring system described leverages several features available in continuous integration tools to track application performance results over time. Preliminary results from monitoring applications on the supercomputers at the Oak Ridge Leadership Computing Facility are presented.
Students perceive computer science in many ways, and these perceptions impact their decisions to continue in computer science. Significant work has been done to identify these perceptions and to recommend interventions. Rather than starting from a proposed intervention, we analyze the perceptions of students considering computer science. We believe that by targeting the perceptions our students have, we can effect change by developing interventions around the issues our students face.
Antimicrobial peptides (AMPs) are small proteins that act as a defense mechanism against microbes in organisms. AMPs can serve as an alternative to antibiotics, which are increasingly ineffective due to bacterial resistance. Despite their importance, the mechanism of their direct antimicrobial activity is unknown. In a computational approach, self-assembly simulations of novel proteins with preassembled lipopolysaccharide membranes were used to determine the direct antimicrobial activity, roles, and functions of AMPs.
Graphs are powerful abstractions for capturing complex relationships in large-scale datasets. An active area of research focuses on theoretical models that define the graph generative mechanism. We present a framework for discriminating such models using a random forest classifier trained on topological graph features. We discover that the random forest classifier is able to discriminate between the Erdős–Rényi and stochastic block models with performance very close to known theoretical detectability bounds.
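A minimal, dependency-free sketch of the feature-extraction step (the feature set here is our own illustration, not necessarily the one used in the poster); in practice the resulting vectors would be fed to a random forest such as scikit-learn's `RandomForestClassifier`:

```python
import random
from itertools import combinations

def erdos_renyi(n, p, rng):
    """Sample an Erdos-Renyi G(n, p) graph as an adjacency dict."""
    adj = {v: set() for v in range(n)}
    for u, v in combinations(range(n), 2):
        if rng.random() < p:
            adj[u].add(v); adj[v].add(u)
    return adj

def two_block_sbm(n, p_in, p_out, rng):
    """Two equal blocks: edge prob p_in within a block, p_out across."""
    adj = {v: set() for v in range(n)}
    for u, v in combinations(range(n), 2):
        p = p_in if (u < n // 2) == (v < n // 2) else p_out
        if rng.random() < p:
            adj[u].add(v); adj[v].add(u)
    return adj

def features(adj):
    """Topological features: mean degree, degree variance, mean clustering."""
    degs = [len(adj[v]) for v in adj]
    mean_deg = sum(degs) / len(degs)
    var_deg = sum((d - mean_deg) ** 2 for d in degs) / len(degs)
    clust = []
    for v in adj:
        nbrs = list(adj[v])
        if len(nbrs) < 2:
            clust.append(0.0)
            continue
        links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        clust.append(2.0 * links / (len(nbrs) * (len(nbrs) - 1)))
    return [mean_deg, var_deg, sum(clust) / len(clust)]
```

Labeled feature vectors from many sampled graphs of each model would then form the classifier's training set.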
JSON in the Big O is a project that seeks to introduce high school students to the basic concepts of programming and logical thinking through a game focused on providing a fun user experience. To achieve its objective, it follows two lines:
- Use of a game that teaches young people basic conceptual structures for algorithmic thinking through levels based on a state automaton.
- The construction of a new cultural image of the programmer.
We study the problem of maximizing the lifetime of wireless sensor networks through routing and the allocation of initial energy over their nodes. To capture the actual behavior of the nodes' batteries, we consider a general high-dimensional nonlinear battery model and adapt it to the problem.
We show that, when this dynamic battery model is adopted in a fixed-topology network, there exists an optimal policy consisting of time-invariant routing probabilities.
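To make the setup concrete, a schematic form of the optimization in our own notation (the poster's battery dynamics are more general than any specific $f_i$ shown here):

```latex
\max_{T,\;\{w_{ij}\},\;\{E_i\}} \; T
\quad \text{s.t.} \quad
\sum_i E_i \le E_{\text{tot}}, \qquad
\dot{s}_i(t) = -f_i\!\left(s_i(t),\, r_i(w)\right), \quad s_i(0) = E_i,
\qquad s_i(t) > 0 \;\; \text{for } t < T,
```

where $w_{ij}$ are routing probabilities, $r_i(w)$ the per-node load they induce, and $s_i$ the nonlinear battery state; the result stated above is that, for a fixed topology, the $w_{ij}$ can be taken time-invariant without loss of optimality.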
Project Eye-Helper is an ongoing pursuit in the Olin College Crowdsourcing and Machine Learning Lab. We aim to create a wearable device combining computer vision and crowdsourcing to help the blind shop for groceries without relying on sighted in-store personnel. Through agile development, we are rapidly producing prototypes that can be used immediately for co-design. As such, we have prototyped with conventional video cameras and Google's 3D mapping tablet.
This work provides an initial investigation into the combined effects of gender and vision impairment in relation to ICT access in Kenya. We conducted a focus group with female students at Thika Secondary School to: 1) understand the barriers they perceive to accessing ICT; 2) identify the social and personal goals that drive their involvement and interest in ICT; and 3) identify strategies for recruiting and retaining females in ICT training.
Cyberspace has become a new platform for criminal activity, reflecting real-world conflicts and hatred. Recent statistics show that, in most regions, a large number of cybercrime victims are women. This paper proposes a systematic and secure cloud platform for women pursuing their own purposes in cyberspace, one that will also help hold cybercriminals accountable and reduce the crime rate and overall impact.
In model-based development, engineers develop elaborate system models but sparsely document requirements. When such systems are tested, the test outcome cannot be adequately verified. In our approach, we use a dynamic backward slicing technique to assess the adequacy of oracles and test cases. The novelty of our approach is the optimization of the slicing criteria. We evaluate our approach using three non-trivial case examples.
The insider threat remains one of the most serious challenges to computer security. Deception techniques have served as a common solution in insider threat detection and several techniques such as decoy techniques have been proposed. In this work, we focus on integrating deception into role-based access control (RBAC) model. We introduce the notion of honey permission and use it to extend RBAC to help in insider threat detection.
MiRNAs play an essential role in gene regulation. The standard method for determining miRNA-gene relationships is perturbation experiments, which are expensive and time consuming. Therefore, computational methods are crucial. We propose an approach for predicting miRNA-gene regulations that uses expression and sequence data to predict miRNA targets. The predicted targets are scored based on a set of features, and the targets predicted for each miRNA are compared with experimentally validated targets from miRNA databases.
Between 2006 and 2012, the New York City Police Department made roughly four million stops as part of the city's controversial stop-and-frisk program, placing social costs on many innocent individuals. First, we estimate stop rates for demographic subgroups of the population. Second, we statistically analyze the reasons that officers state for making each stop in order to develop simple heuristics to aid officers in making better stop decisions.
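A toy sketch of both steps (hypothetical field names and subgroup labels; a real analysis would control for precinct, time, and exposure baselines):

```python
from collections import Counter

def stop_rates(stop_subgroups, population):
    """Per-capita stop rate for each demographic subgroup.

    stop_subgroups: one subgroup label per recorded stop.
    population: subgroup -> population count (the exposure baseline).
    """
    counts = Counter(stop_subgroups)
    return {g: counts[g] / population[g] for g in population}

def hit_rate_by_reason(records):
    """Heuristic input: fraction of stops citing each stated reason that
    actually recovered contraband ('hit'); low-hit-rate reasons become
    candidates for stricter stop criteria."""
    totals, hits = Counter(), Counter()
    for reason, hit in records:
        totals[reason] += 1
        hits[reason] += int(hit)
    return {r: hits[r] / totals[r] for r in totals}
```

For example, ranking reasons by hit rate yields a simple checklist-style heuristic an officer could apply before deciding to stop.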
Personalized medicine is an emerging medical technology that promises to tailor health care to the individual patient. In this experiment, signals from bioluminescent cells are detected using an integrated circuit biochip that consists of a single-photon avalanche diode, and are converted to digital output. The integrated circuit biochips are designed for potential integration with human cells capable of emitting bioluminescent signals to indicate cell death or deleterious conditions.
The socially optimal distribution power market clearing problem with diverse, complex-utility-structure participants and non-convex load flow constraints poses computational challenges. We extend the Proximal Message Passing algorithms in the literature and demonstrate that we can obtain (i) significant computational improvements, (ii) excellent accuracy of results, both compared to solving the exact centralized problem, and (iii) significantly lower communication requirements for coordination in comparison to PMP algorithms existing in the literature.
Numerous outreach activities have been developed to increase interest in computing prior to students reaching university. But are these activities effective in increasing enrollment in computing majors? Our study asked college students to recall their experiences in computing outreach programs and indicate whether those experiences influenced their choices of major in college. We found that these activities more heavily influenced males than females.
Rule composition and visualization are recurring topics in user interfaces. Traditional interfaces representing logical statements are not suited for non-technical users. A new interface solution is proposed to provide users with a more intuitive way to define logical conditions. It is applied to a home automation system and evaluated against a traditional interface, resulting in a greater improvement for non-technical users than for technical users.
Nowadays, users of social media constantly struggle to keep up with the latest mainstream news among the large amount of available content. In this paper, we present a framework that filters topic-specific relevant information from irrelevant information in the stream of text provided by social media platforms and further provides a sequential summary, or event storyline. Experiments verified the effectiveness of the proposed framework.
Moonlighting proteins are a class of proteins that show multiple cellular functions within a single polypeptide chain. Their functional diversity is often due to one or a combination of different cellular phenomena. These proteins are found to be vital as drug targets and in disease development. Previously, we systematically characterized moonlighting proteins at genome scale using functional/omics-scale information. Here, we develop a machine-learning prediction model for automatic identification of moonlighting proteins.
Many college faculty are finding themselves at a crossroads on how to best deliver their traditional content to 21st century learners. The poster presents initial results from a study that investigated the use of an enhanced online learning experience in a traditionally taught CS1 course. The results revealed an increase in student performance after treatment was administered, but students were conflicted about the delivery method.
Driver distraction is one of the primary causes of accidents, followed by drunk driving and speeding. The integration of information and communication technology has made mobile phones a major source of cognitive distraction. Here, however, we utilize the capabilities of these same phones to identify distracted behaviors by analyzing the neurological response in an individual's brain signals. This application will address public safety concerns by providing feedback in real time.
GenePedia is a system whose applicability relies heavily on use in health centers. Its main purpose is to keep track of our lineage and family tree based on parent-child relationships. Hospital applicability was chosen because hospitals are the institutions that record births and deaths. GenePedia uses an individual's family-tree information to determine genealogy, family life span, and family traits.
Usability design is drawing attention from both parties in bioinformatics software development: the users and the developers. The goal of this study was to identify the key reasons behind the current usability concerns of bioinformatics software and, from the perspective of users, to identify the usability features that ease their use of the software. Four in-depth interviews with frequent bioinformatics software users provided valuable insights into these questions.
The high processing complexity of detection in large-scale communication systems constrains the implementation of practical massive multiple-input multiple-output (MIMO) systems. In this paper, parallelized matrix operations are utilized to increase the efficiency of the detector. Experimental results on the Compute Unified Device Architecture (CUDA) platform show that our approach achieves significant performance improvement.
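As an illustration of the kind of matrix computation being parallelized, here is a serial, real-valued zero-forcing detector sketch (a standard baseline, not necessarily the detector evaluated in the poster); a CUDA implementation would batch these per-symbol solves across thousands of threads:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def zf_detect(H, y, constellation=(-1.0, 1.0)):
    """Zero-forcing detection: invert the channel H x = y,
    then quantize each entry to the nearest constellation symbol."""
    x_hat = solve(H, y)
    return [min(constellation, key=lambda s: abs(s - v)) for v in x_hat]
```

In a massive MIMO setting the matrices are much larger and complex-valued, which is precisely why parallelized matrix operations pay off.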
Federated SPARQL queries can gather RDF (Resource Description Framework) data from several databases across a network, providing a powerful tool to aggregate data. Given a federated query, it is crucial to generate an optimized query to minimize execution time. We propose a greedy algorithm to optimize a federated SPARQL query by reordering the service calls to different Linked Data endpoints using cardinality estimates.
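A minimal sketch of one plausible greedy ordering (call names and the tie-breaking rule are our own assumptions; a real optimizer would derive cardinality estimates from endpoint statistics such as VoID descriptions):

```python
def greedy_order(calls, cardinality):
    """calls: service-call name -> set of variables it binds.
    cardinality: service-call name -> estimated result size.
    Greedily schedule calls: prefer calls that share variables with
    those already scheduled (join-friendly), then lowest cardinality."""
    order, bound = [], set()
    remaining = set(calls)
    while remaining:
        nxt = min(remaining,
                  key=lambda c: (0 if calls[c] & bound or not bound else 1,
                                 cardinality[c]))
        order.append(nxt)
        bound |= calls[nxt]
        remaining.remove(nxt)
    return order
```

Executing small, joinable service calls first keeps intermediate results small, which is the usual rationale for cardinality-based reordering.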
Reliable and cost-effective ubiquitous health monitoring is the primary goal of Wireless Body Area Network (WBAN) applications. WBAN applications for pervasive healthcare monitoring generate very high volumes of heterogeneous data (text, video, images) over short durations. Traditional RDBMSs do not provide a cost-effective solution for such hundreds of petabytes of data. We propose a cloud-based WBAN architecture to facilitate data reliability in WBAN applications.
As industry leaders seek creative engineers capable of designing solutions to the circuit problems of the future, such as the end of Moore’s Law, engineering education must incorporate more creativity and innovative design approaches. This paper proposes the use of gamification for encouraging creative thinking among students in electronic design automation. This work may also be used to find new solutions to circuit design challenges by data-mining user problem-solving techniques.
The ability to custom design proteins to target protein-protein interactions bears potential for the treatment of human diseases. Engineering proteins with unique functionalities is limited by naturally occurring scaffolds. We developed a computational platform that enables the design of therapeutic proteins from scratch. We use de novo design to disrupt interactions with TrkA as potential anti-cancer drugs. Our approach can be extended to other therapeutically relevant protein drug targets.
We attempt to derive computational analogues of existing economic and financial indicators from Big Data, namely Google search volumes and Twitter. Our results show that, surprisingly, these computational indicators are not only strongly correlated to existing official economic and financial indicators, but may in fact have predictive value.
This poster presents a smartphone application that monitors the second heart sound. The heart sounds are recorded using a stethoscope with an external microphone attached to the smartphone. Next, the signal is analyzed using wavelet transforms, which helps identify the aortic and pulmonic components of the second heart sound. Finally, the time delay between the components is calculated in order to classify the signals as normal or abnormal heart sounds.
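A dependency-free stand-in for the last two steps (a moving-average energy envelope substitutes for the wavelet analysis, and the delay threshold is illustrative, not clinical):

```python
def envelope(signal, win=5):
    """Moving-average energy envelope (stand-in for a wavelet scalogram)."""
    e = [s * s for s in signal]
    return [sum(e[max(0, i - win):i + win + 1]) for i in range(len(e))]

def split_s2(signal, fs, threshold_ms=30.0):
    """Locate the two largest envelope peaks (assumed aortic and pulmonic
    components) and classify by their delay: 'normal' below the threshold,
    'abnormal' (wide split) otherwise."""
    env = envelope(signal)
    peaks = [i for i in range(1, len(env) - 1)
             if env[i] > env[i - 1] and env[i] >= env[i + 1]]
    top2 = sorted(sorted(peaks, key=lambda i: env[i], reverse=True)[:2])
    delay_ms = (top2[1] - top2[0]) / fs * 1000.0
    return delay_ms, ("normal" if delay_ms < threshold_ms else "abnormal")
```

For instance, two bursts 60 ms apart at a 1 kHz sampling rate would be flagged as a wide split under this toy threshold.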
Data outsourcing is becoming increasingly popular. As the third-party service provider may not be trustworthy, there should be a way to verify the results. Authentication techniques already exist for many spatial queries over Euclidean space and road networks. However, no work has been done to authenticate queries in the presence of obstacles. We develop an approach to authenticate k-nearest-neighbor queries in obstructed space.
A majority of the world's information is contained in `unstructured' data such as text or video. Search engines, which provide content-based access to this material, are developed using standard sets of realistic test cases that let researchers measure the relative effectiveness of alternative approaches. For 25 years, the Text REtrieval Conference (TREC) has been instrumental in creating the necessary infrastructure to measure search effectiveness, thus advancing the state of the art.
We've heard of crowdsourcing for deciphering handwriting and transcribing audio clips, but what happens when you have more complex tasks? Complex tasks are harder to evaluate, so how do we ensure high quality output? We present a framework for crowdsourcing complex tasks that tackles the problem of high quality at low cost. We’ll present techniques from creating a hierarchy of trusted workers and worker modeling to predicting task quality.
The explosion in behavioral, demographic and psychographic data, along with the availability of big data technologies, has opened up an unprecedented array of insights into customer needs and behaviors. Content personalization and behavioral targeting are two techniques that result in increased conversions. In this talk we will look at building a pipeline to capture data from various sources and use that data to enhance the user experience of product marketing.
Detection of early indicators of forced migration in a geographic area is important to alert social scientists to troubling hotspots. We integrate subject-matter expert knowledge and international news articles from disparate sources. We leverage state-of-the-art Stanford NLP tools to process the text corpus within the Apache Spark cluster-computing framework. We generate a concept graph and apply algorithms to measure the relatedness of concepts associated with forced migration and specific locations.