Computer Science Seminar Series

  • September 23, 2022
  • 12:30 PM - 2:30 PM CST
  • McCormick Lounge in Coffey Hall
  • Mohammed Abuhamad, mabuhamad@luc.edu
  • Free and open to the public

    Beginning Friday, September 23rd, the Computer Science Department will be hosting a CS Seminar Series featuring various exciting speakers covering topics related to the field (see below for this Friday's speaker bios and talk abstracts). Please RSVP for our first event by next Wednesday, September 21st to enjoy a meal, a social, and two talks by Dr. Yanjun Gao and Dr. Neil Klingensmith. Thank you, and we'll see you there!

    Schedule:
    12:30 - 01:10 PM - Lunch and Social
    01:10 - 01:50 PM - Talk by Dr. Yanjun Gao
    01:50 - 02:30 PM - Talk by Dr. Neil Klingensmith

    Speakers and Talks:


    Augmented Intelligence in Healthcare: How Can NLP Help Physicians at the Bedside?
    Talk By: Dr. Yanjun Gao

    Abstract:
    The electronic health record (EHR) contains patients' medical history, lab tests, diagnoses, and treatment plans collected by the multidisciplinary team of physicians, nurses, and support staff who attend to their care. While EHRs are intended to support efficient care, they are still riddled with problems of information overload and poorly organized notes that overwhelm physicians and lead to burnout and, ultimately, inefficient care. Applying methods from natural language processing (NLP) to EHR data is a growing field with many potential applications in clinical decision support and augmented care. In the first part of the talk, I will examine the progress of clinical NLP over the years and describe both the barriers we have overcome and the challenges that remain in advancing the field. The second part of the talk introduces a new suite of clinical NLP tasks addressing clinical reasoning, a critical cognitive process in medical education. I will also discuss how this new suite of tasks could drive a paradigm shift in clinical NLP from information extraction and outcome prediction to diagnostic reasoning, and ultimately to effective clinical decision support systems for physicians at the bedside.

    Speaker Bio: Dr. Yanjun Gao is a postdoctoral research associate in the ICU Data Science Lab, Department of Medicine, at the University of Wisconsin-Madison. Her current focus is developing NLP models for clinical diagnostic reasoning using electronic health records and medical knowledge bases. She serves on the organizing committees of the National NLP Clinical Challenges and Graph-Based Natural Language Processing, and as a guest editor for the Journal of Biomedical Informatics. She has publications across major NLP/AI conferences and clinical informatics journals. Dr. Gao earned her PhD in Computer Science and Engineering from the NLP Lab at Pennsylvania State University.


    Are You Really Muted?: A Privacy Analysis of Mute Buttons in Video Conferencing Apps
    Talk by Dr. Neil Klingensmith

    Abstract:
    In the post-pandemic era, video conferencing apps (VCAs) have converted previously private spaces, such as bedrooms, living rooms, and kitchens, into semi-public extensions of the office. For the most part, users have accepted these apps in their personal space without much thought about the permission models that govern the use of their personal data during meetings. While access to a device's video camera is carefully controlled, little has been done to ensure the same level of privacy for access to the microphone. In this work, we ask the question: what happens to the microphone data when a user clicks the mute button in a VCA? We first conduct a user study to analyze users' understanding of the permission model of the mute button. Then, using runtime binary analysis tools, we trace raw audio in many popular VCAs as it traverses the app from the audio driver to the network. We find fragmented policies for dealing with microphone data among VCAs: some continuously monitor the microphone input during mute, and others do so periodically. One app transmits statistics of the audio to its telemetry servers while the app is muted. Using network traffic that we intercept en route to the telemetry server, we implement a proof-of-concept background activity classifier and demonstrate the feasibility of inferring the ongoing background activity during a meeting, such as cooking, cleaning, or typing. We achieved 81.9% macro accuracy in identifying six common background activities using intercepted outgoing telemetry packets while a user is muted.

    Speaker Bio: Dr. Neil Klingensmith is an assistant professor in the Department of Computer Science at Loyola University Chicago. He earned his PhD in Electrical and Computer Engineering from the University of Wisconsin-Madison in 2019. His interests include security and privacy on mobile and IoT devices.