
BioSense

This webinar will help you learn about new and upcoming BioSense 2.0 features and how they can enhance your user experience. Nabarun Dasgupta of the BioSense Redesign Team will review new BioSense functionalities and services and discuss how binning occurs. He will also preview analytic tools under development. Current BioSense 2.0 users who are familiar with its tools and services will find this webinar instructive, and prospective users will gain a deeper understanding of BioSense 2.0 functionalities.

Presenters

Presenters Caleb Wiedeman and Harold Gil will describe some of the processes their organizations use to ensure the quality of data in BioSense 2.0. First, Caleb Wiedeman will review his routine for verifying data quality, including checking the front end of the BioSense 2.0 application for aberrations and drops in visit counts, linking front-end data to the back end using R, and using phpMyAdmin to check daily files. This portion of the presentation will focus on Tennessee’s system and the data it receives from six facilities within a single health system.
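The check for drops in visit counts described above can be sketched in a few lines. This is an illustrative Python sketch, not the Tennessee team's actual workflow (which uses R and phpMyAdmin); the 7-day window and 50% threshold are assumptions chosen for demonstration.

```python
# Illustrative sketch of a daily visit-count drop check. Window size and
# threshold are assumptions, not the presenters' actual parameters.

def flag_visit_drops(daily_counts, window=7, threshold=0.5):
    """Return indices of days whose visit count falls below `threshold`
    times the mean of the preceding `window` days."""
    flagged = []
    for i in range(window, len(daily_counts)):
        baseline = sum(daily_counts[i - window:i]) / window
        if baseline > 0 and daily_counts[i] < threshold * baseline:
            flagged.append(i)
    return flagged

counts = [100, 98, 105, 97, 102, 99, 101, 45, 100, 103]
print(flag_visit_drops(counts))  # → [7]: day 7 falls well below its 7-day baseline
```

A check like this would typically run per facility, so that a single feed going quiet is not masked by normal volume at the other facilities.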



Description

The National Biosurveillance Integration Center (NBIC) integrates, analyzes, and distributes key information about health and disease events to help ensure the nation’s responses are well-informed, save lives, and minimize economic impact. NBIC serves as a bridge between Federal, State, Local, Territorial, and Tribal entities to conduct biosurveillance across human, animal, plant, and environmental domains. The integration of information enables early warning and shared situational awareness of biological events to inform critical decisions directing response and recovery efforts.

To meet its mission objectives, NBIC utilizes a variety of data sets, including open source information, to provide comprehensive coverage of biological events occurring across the globe. NBIC Biofeeds is a digital tool designed to improve the efficiency with which biosurveillance analysts review and analyze large volumes of open source reporting on a daily basis; moreover, the system provides a mechanism to disseminate tailored feeds, allowing NBIC to better meet the specific information needs of individual interagency partners. The tool is being developed by the Department of Energy’s (DOE) Pacific Northwest National Laboratory (PNNL) and is in a testing and evaluation phase supported by NBIC biosurveillance subject matter experts. Integration with the Defense Threat Reduction Agency’s (DTRA) Biosurveillance Ecosystem (BSVE) is also underway. NBIC Biofeeds Version 1 is expected to be fully operational in Fiscal Year 2017.

Objective

The National Biosurveillance Integration Center (NBIC) is developing a scalable, flexible open source data collection, analysis, and dissemination tool to support biosurveillance operations by the U.S. Department of Homeland Security (DHS) and its federal interagency partners. 

Submitted by Magou on
Description

Once a facility meets data quality standards and is approved for production, an assumption is made that the quality of the data received remains at the same level. Production data quality reports from various states, generated using a SAS data quality program, revealed a need for ongoing assessment of production data quality. By implementing a periodic data quality update on all production facilities, data quality has improved both for production data as a whole and for individual facility data. Through this activity, several root causes of data quality degradation have been identified, allowing processes to be implemented to mitigate their impact on data quality.
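The periodic assessment described above amounts to re-measuring production data against the baseline a facility met at onboarding. The sketch below is a hypothetical Python illustration of that idea, not the SAS program the abstract refers to; the field names and the 10-point tolerance are invented for demonstration.

```python
# Hypothetical sketch of a periodic production data quality check:
# compare per-field completeness in recent records against the baseline
# the facility met at approval. Field names and tolerance are assumptions.

def completeness(records, fields):
    """Percent of records with a non-empty value, per field."""
    n = len(records)
    return {f: 100.0 * sum(1 for r in records if r.get(f)) / n for f in fields}

def degraded_fields(baseline, current, tolerance=10.0):
    """Fields whose completeness fell more than `tolerance` points."""
    return [f for f, pct in current.items() if baseline.get(f, 0) - pct > tolerance]

baseline = {"chief_complaint": 98.0, "patient_zip": 95.0}
recent = [{"chief_complaint": "fever", "patient_zip": ""},
          {"chief_complaint": "cough", "patient_zip": ""}]
current = completeness(recent, ["chief_complaint", "patient_zip"])
print(degraded_fields(baseline, current))  # → ['patient_zip']
```

Running a check like this on a schedule, rather than only at onboarding, is what surfaces the gradual degradation the abstract describes.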

Objective

To explore the quality of data submitted once a facility is moved into an ongoing submission status and address the importance of continuing data quality assessments. 

 

Description

Hurricane ‘Superstorm’ Sandy struck New Jersey on October 29, 2012, harming the health of New Jersey residents and causing billions of dollars of damage to businesses, transportation, and infrastructure. Monitoring health outcomes for increased illness and injury due to a severe weather event is important in measuring the severity of conditions and the efficacy of the state response, as well as in emergency response preparations for future severe weather events. Following the experience with Hurricane Sandy, NJDOH initiated a project to develop a suite of 19 indicators, known as the Severe Weather Classifier (SWC), in EpiCenter, an online system that collects emergency department chief complaint data in real time, to perform syndromic surveillance of extreme weather–related conditions. NJDOH has since used these classifiers in more recent events to monitor for weather-related visits to emergency departments (EDs) in storm-affected areas.

In June 2015, a squall line of damaging thunderstorms, known as a “bow echo,” caused downed wires and multi-day power outages in Camden and Gloucester counties in southern New Jersey. Almost exactly seven months later, in January 2016, Winter Storm Jonas dropped more than a foot of snow over New Jersey. These events provided an opportunity to assess the indicators within the SWC.
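A chief-complaint classifier of the kind described above can be sketched as keyword matching against free text. This Python sketch is purely illustrative: the indicator names and keyword lists are invented, and the actual SWC comprises 19 indicators developed by NJDOH.

```python
# Illustrative keyword-based chief complaint classifier, in the spirit of
# the Severe Weather Classifier. Indicator names and keywords are invented.

INDICATORS = {
    "carbon_monoxide": ["carbon monoxide", "generator fumes"],
    "cold_exposure":   ["hypothermia", "frostbite", "cold exposure"],
    "storm_injury":    ["fell on ice", "snow shoveling", "chainsaw"],
}

def classify(chief_complaint):
    """Return the indicator(s) whose keywords appear in the complaint text."""
    text = chief_complaint.lower()
    return [name for name, keywords in INDICATORS.items()
            if any(kw in text for kw in keywords)]

print(classify("Chest pain after snow shoveling"))  # → ['storm_injury']
```

In practice such classifiers run continuously against incoming ED visit records, so that counts per indicator can be monitored for increases during and after a storm.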

Objective

To report the results of the application of New Jersey’s Severe Weather Classifier in New Jersey’s syndromic surveillance system during two extreme weather events. 


The NSSP Support Team will present an overview of the plans for a future Master Facility Table user interface (UI) designed to replace the current Excel-spreadsheet update process. The presentation will feature mock-ups of the UI screens and descriptions of the proposed functionality, and will allow time for Q&A. The future UI is proposed to:

- Be supported through the AMC and allow sites to view, add, and edit facility information,

Description

Traditionally, public health surveillance departments collect, analyze, interpret, and package information into static surveillance reports for distribution to stakeholders. This resource-intensive production and dissemination process has major shortcomings that impede end users from optimally using the information for public health action. Often, by the time traditional reports are ready for dissemination, they are outdated. Information can be difficult to find in long static reports, and users have no way to interact with the data. Instead, ad hoc data requests are made, resulting in inefficiencies and delays.

Use of electronic dashboards for surveillance reporting is not new. Many public health departments have worked with information technology (IT) contractors to develop such technically sophisticated products requiring IT expertise. The technology and tools now exist to equip the public health workforce to develop in-house surveillance dashboards, which allow for unprecedented speed, flexibility, and cost savings while meeting the needs of stakeholders. At Alberta Health Services (AHS), in-house, end-to-end dashboard development infrastructure has been established that provides epidemiologists and data analysts full capabilities for effective and timely reporting of surveillance information. 

Objective

To address the limitations of traditional static surveillance reporting by developing in-house infrastructure to create and maintain interactive surveillance dashboards. 
