Seminar Series - Jun Huan, Ph.D. - Deep Learning: Investigating Feed-Forward Deep Neural Networks for Modeling High-Throughput Chemical Bioactivity Data

Abstract: In recent years, research in Artificial Neural Networks (ANNs) has resurged, now under the Deep Learning umbrella, and grown extremely popular due to major breakthroughs in methodology and computing capability. Deep Learning methods belong to the family of representation-learning algorithms, which attempt to extract and organize discriminative information from the data. The recently reported success of Deep Learning techniques in crowd-sourced QSAR and predictive-toxicology competitions has showcased these methods as powerful tools for drug-discovery and toxicology research. Nevertheless, reported applications of Deep Learning techniques to modeling complex bioactivity data for small molecules remain limited. In this talk I will present our recent work on optimizing the hyper-parameters of feed-forward Deep Neural Networks (DNNs) and evaluating their performance against shallow methods. In our study, DNNs were benchmarked against Random Forest (RF), Support Vector Machine (SVM), and Naïve Bayes (NB) classifiers.
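As an illustration of the models discussed in the abstract, the sketch below implements a minimal feed-forward network with ReLU activations and (inverted) Dropout in plain Python. This is not the speaker's implementation: the layer sizes, dropout rate, and random weights are arbitrary assumptions for demonstration only.

```python
import math
import random

def relu(v):
    # ReLU activation: max(0, x) applied elementwise.
    return [max(0.0, x) for x in v]

def dropout(v, p, training, rng):
    # Inverted dropout: during training, zero each unit with probability p
    # and scale survivors by 1/(1-p) so expected activations match inference.
    if not training or p == 0.0:
        return list(v)
    keep = 1.0 - p
    return [x / keep if rng.random() < keep else 0.0 for x in v]

def dense(v, weights, bias):
    # Fully connected layer; weights is a [n_out][n_in] matrix.
    return [sum(w_i * x_i for w_i, x_i in zip(row, v)) + b
            for row, b in zip(weights, bias)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, params, training=False, p_drop=0.25, rng=random.Random(0)):
    # Two hidden ReLU layers with dropout, then a sigmoid output unit
    # for binary active/inactive classification.
    h = x
    for W, b in params[:-1]:
        h = dropout(relu(dense(h, W, b)), p_drop, training, rng)
    W, b = params[-1]
    return sigmoid(dense(h, W, b)[0])

# Tiny random parameter set: 4 input features -> 8 -> 8 -> 1 output.
rng = random.Random(42)
def rand_layer(n_out, n_in):
    return ([[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_out)],
            [0.0] * n_out)

params = [rand_layer(8, 4), rand_layer(8, 8), rand_layer(1, 8)]
print(forward([0.1, -0.2, 0.3, 0.4], params))  # a probability strictly between 0 and 1
```

During training the dropout mask is random, while at inference (`training=False`) the network is deterministic; the 1/(1-p) rescaling keeps the expected activation magnitude the same in both modes.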
The non-parametric Wilcoxon paired signed-rank test was employed to compare the performance of the DNNs to RF, SVM, and NB. Overall, it was found that DNNs with 2 hidden layers, the ReLU activation function, and the Dropout regularization technique achieved strong classification performance across all tested datasets. Our results demonstrate that DNNs are powerful techniques for modeling complex bioactivity data.

Speaker's Short Bio: Dr. Jun (Luke) Huan is a Professor in the Department of Electrical Engineering and Computer Science at the University of Kansas. He directs the Data Science and Computational Life Sciences Laboratory at the KU Information and Telecommunication Technology Center (ITTC), and holds courtesy appointments at the KU Bioinformatics Center and the KU Bioengineering Program as well as a visiting professorship from GlaxoSmithKline plc. Dr. Huan works on data science, machine learning, data mining, big data, and interdisciplinary topics including bioinformatics. He has published extensively and graduated a number of Ph.D. students. He serves on the editorial boards of several international journals, including the Springer Journal of Big Data, the Elsevier Journal of Big Data Research, and the International Journal of Data Mining and Bioinformatics, and regularly serves on the program committees of top-tier international conferences on machine learning, data mining, big data, and bioinformatics. Dr. Huan's research is recognized internationally: he is a recipient of the prestigious National Science Foundation Faculty Early Career Development (CAREER) Award, and his group won the Best Student Paper Award at the IEEE International Conference on Data Mining and the Best Paper Award (runner-up) at the ACM International Conference on Information and Knowledge Management. His work has been covered by mass media outlets including Science Daily, R&D Magazine, and EurekAlert (sponsored by AAAS). His research has been supported by NSF, NIH, DoD, and the University of Kansas. Dr. Huan currently serves as a Program Director at NSF in the Information and Intelligent Systems Division of the Computer and Information Science and Engineering Directorate.
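The evaluation methodology mentioned in the abstract, a paired Wilcoxon signed-rank test over per-dataset scores, can be sketched as follows. This is an illustrative pure-Python version using the normal approximation to the null distribution; the per-dataset AUC values are made up for demonstration, and a real study would use a library routine such as scipy.stats.wilcoxon.

```python
import math

def wilcoxon_signed_rank(a, b):
    """Paired Wilcoxon signed-rank test (normal approximation).

    Returns (W, p): W is the smaller of the positive/negative rank sums
    and p a two-sided p-value. Zero differences are dropped, and ties
    in |difference| receive average ranks.
    """
    diffs = [x - y for x, y in zip(a, b) if x != y]
    n = len(diffs)
    # Rank absolute differences, averaging ranks over tied values.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2.0 + 1.0  # average of the 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_pos = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_neg = sum(r for d, r in zip(diffs, ranks) if d < 0)
    w = min(w_pos, w_neg)
    # Normal approximation to the null distribution of W.
    mean = n * (n + 1) / 4.0
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    z = (w - mean) / sd
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return w, p

# Hypothetical per-dataset AUCs (x 100, kept as integers so the
# differences are exact) for a DNN vs. a shallow baseline.
dnn = [91, 88, 93, 85, 90, 87, 92, 89]
rf  = [88, 86, 90, 86, 87, 84, 89, 85]
w, p = wilcoxon_signed_rank(dnn, rf)
print("W =", w, "p =", round(p, 4))
```

Because the test is paired per dataset and makes no normality assumption about the score differences, it is a common choice for comparing classifiers across a small collection of benchmark datasets.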