Facilities


An evening view of UF’s Century Tower and University Auditorium.

University Resources

The Research Computing systems are located in the University of Florida data center. The machine room is connected to other campus resources by the 200 gigabit per second Campus Research Network (CRN), now commonly called the Science DMZ. The CRN was created with an NSF Major Research Instrumentation award in 2004 and has been maintained by the University since the end of that award. The CRN connects the HPC systems to the Florida Lambda Rail, from which the National Lambda Rail and Internet2 are accessible. The University of Florida was the first institution (April 2013) to meet all requirements to become an Internet2 Innovation Platform, which implies the use of software-defined networking (SDN), the implementation of a Science DMZ, and a connection at 100 Gb/s to the Internet2 backbone. An NSF CC-NIE award in 2012 funded the 100 Gb/s switch, and an NSF MRI grant awarded in 2012 funded the upgrade of the CRN (Science DMZ) to 200 Gb/s. The upgrade has been operational since the winter of 2013.

High-performance computing and big-data analytics

Research Computing at the University of Florida (UF) operates the HiPerGator supercomputer, a cluster-based system with a combined capacity of about 21,000 cores in multi-core servers. In November 2015, this capacity was expanded with 30,000 new Intel cores, bringing the total to 51,000 cores. The servers are part of an integrated InfiniBand fabric. The clusters share over 5 petabytes (PB) of distributed storage via the Lustre parallel file system. In addition, Research Computing houses about 2 PB of storage for the High Energy Physics collaboration of the Compact Muon Solenoid (CMS) experiment. The system includes over 100 NVIDIA GPU accelerators and 24 Intel Xeon Phi accelerators, available for experimental and production research as well as for training and teaching.

Florida cyberinfrastructure

Universities in the state of Florida have joined forces in the Sunshine State Education & Research Computing Alliance (SSERCA) to build a robust cyberinfrastructure for sharing expertise and resources (http://sserca.org). The current members are Florida Atlantic University (FAU), Florida International University (FIU), Florida State University (FSU), University of Central Florida (UCF), University of Florida (UF), University of Miami (UM), and University of South Florida (USF). The affiliate institutions are Florida Agricultural and Mechanical University (FAMU), University of North Florida (UNF), University of West Florida (UWF), Florida Polytechnic University (FPU), Florida Institute of Technology (FIT), Nova Southeastern University, and New College of Florida.
The Florida Lambda Rail (FLR) provides the underlying fiber-optic network and connectivity between these institutions and many others. The FLR backbone was upgraded to 100 Gbps in June 2015. The University of Florida is connected to this backbone at the full speed of 100 Gbps and has been connected at that rate to the Internet2 backbone since January 2013.

Restricted data storage and analytics

Research projects may involve storing and processing restricted data, including intellectual property (IP), protected health information (PHI), and Controlled Unclassified Information (CUI), as regulated by the Health Insurance Portability and Accountability Act (HIPAA), the International Traffic in Arms Regulations (ITAR), the Export Administration Regulations (EAR), and the Family Educational Rights and Privacy Act (FERPA). For such projects, Research Computing supports two environments:

  1. Research Shield, which meets the NIST 800-53 rev. 4 “moderate” baseline for contracts that require FISMA compliance and has been operating since June 2015, and
  2. GatorVault (http://www.rc.ufl.edu/resources/hardware/gatorvault/), which is approved for PHI, FERPA, IP, and ITAR/EAR-restricted data and has been operating since December 2015.

Department Resources and Network Infrastructure

The department operates two Ubuntu 18.04 Linux terminal servers and two Windows Server 2016 terminal servers, available to all departmental users via SSH and RDP. Users can run jobs and log in from remote locations. Faculty members have access to two additional dedicated servers reserved for their use only.
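As a rough illustration of this remote access, the sketch below shows how a user might reach one of the Linux terminal servers over SSH and run a command from off campus; the hostname and username are hypothetical placeholders rather than the department's actual server names, and graphical access to the Windows servers would instead go through a standard RDP client.

```python
# Minimal sketch (hypothetical hostname and username) of remote SSH access
# to a departmental Linux terminal server using the paramiko library.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # simplification for this sketch
client.connect("linux-terminal.example.edu", username="gatorlink_user")

# Run a simple command on the remote server and print its output.
stdin, stdout, stderr = client.exec_command("uname -a")
print(stdout.read().decode())

client.close()
```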
All faculty offices are equipped with a Windows, Linux, or Mac workstation. Standard software installations include Ubuntu 18.04 or Windows 10, Java, jGRASP, many Microsoft packages from the Microsoft Developer Network, Mozilla Firefox, and Google Chrome.
Database servers and software include MySQL, PostgreSQL, and Oracle. Wireless access is available throughout the CSE Building and the UF campus, including student dorms, cafeterias, and other public areas.
The classrooms in the CSE building are all equipped with multimedia support and computers housed in a locked kiosk. These resources all have access to the University's wireless network. The UF College of Engineering requires all students to possess an adequately provisioned laptop computer, ensuring easy access to all resources in the classrooms.
The bulk of the CISE’s disk storage comes from a Sun 7410 with 66TB of raw disk space.
An additional 300TB is provided by other servers.
There are about 35 servers running Red Hat Enterprise Linux 7 to provide services such as:
• Web hosting
• Database hosting (MySQL, PostgreSQL, Oracle)
• Kerberos / LDAP authentication
• DNS
• DHCP
• Backups via Tivoli Storage Manager and disk-based rsyncs
• Samba
• NFS
• Security-related services

CISE’s web services are supported on a Dell R720, 128GB of memory and 32 2.6 Intel Xeon CPUs. They serve department content, user content and various departmental web applications.
CISE has about 220 Linux PCs running Ubuntu 16.04 LTS and 110 PCs running Windows 10. They serve as lab machines and workstations for students, teaching assistants, research assistants, and faculty. Of these, 34 Windows PCs and 90 Linux PCs are in public labs intended for general student use as well as for lab sections of graduate and undergraduate classes.
CISE has a shared computer cluster consisting of a head node with 16 2.6 GHz AMD Opteron cores, 32 GB of memory, and 40 TB of storage, plus 15 worker nodes, each with dual Opteron processors and 16 GB of memory, running Red Hat Enterprise Linux 7. All servers in the CISE department are connected to the switching backbone with 1 Gb/s Ethernet. Inter-switch uplinks are transitioning from EtherChannel bundles of six or more 1 Gb/s ports to 10 Gb/s links. The newest servers have 10 Gb/s connections for network storage on a private 10 Gb/s network. CISE's Cisco hardware includes a Catalyst 6513, one Catalyst 6509E, and three Catalyst 4506s, providing routing and switching capabilities to the more than 600 devices and 80 networks in the department. CISE's external connection is a 1 Gb/s fiber link to the University of Florida's core network.
A quota-based printing system allows the department to offer free printing to all students and to teaching and research assistants.

User Studies Lab (USL)

The CISE department has approximately 200 sq ft designated as a dedicated user studies lab. Projects involving CISE researchers may reserve the space free of charge as needed. The lab is split into an observation room and a human-subjects participation room, separated by a one-way mirror. Furniture includes desks and chairs that researchers can arrange and configure as needed for each study. All computer and recording equipment is brought into the space by the researchers for their individual studies. All equipment needed for use of the user studies lab on this project is already owned by the labs or departments of the relevant project personnel.

Equipment & Other Resources 

The Department of Computer and Information Science and Engineering possesses the following resources, available for use in the proposed work.
The perceptual experiments will occur in an Industrial Acoustics (IAC) double-walled test chamber located in Dr. McMullen's laboratory. The booth (Model 120A3) has outside dimensions of 9'0" (l) x 9'0" (w) x 8'5" (h) and inside dimensions of 7'0" (l) x 7'0" (w) x 6'6" (h). The booth has standard flooring, which requires a 6" step up to enter the chamber. For ADA compliance, it also has a portable aluminum fold-up ramp that weighs 18 lbs, can carry up to 600 lbs, and provides a non-slip surface 4' long and 30" wide with 1" sides. The booth also contains a standard audiometric window and an instrument-connection jack panel with USB. Lighting is built into the roof panels, and the chamber is prewired with two dual 120 V AC outlets. Ventilation is provided by a fan through built-in roof silencers, and the air in the booth is changed once every ten minutes. Lastly, the booth is equipped with two sprinklers in the event of an emergency. The booth has a dedicated LAN drop.
The proposed research will also have access to a cluster consisting of 12 nodes, each with two dual-core Opteron processors, providing roughly 100 gigaflops of raw computing power, as well as a 32-core shared-memory multiprocessor with 32 GB of memory. We also have access to a Science and Engineering HPC (Tier 1/Tier 2) facility at the University of Florida that is currently operational with a state-of-the-art InfiniBand interconnect donated by Cisco. The HPC Center has several clusters with 4,000 cores in dual-core, quad-core, and hex-core servers. Many of the servers are part of two distinct InfiniBand fabrics. The clusters share 120+ terabytes of distributed storage using the Lustre parallel file system.
The bulk of the disk storage comes from a Sun 7410 with 66 TB of raw disk space. There are approximately 35 servers providing services including web, email, database (MySQL, Oracle, and PostgreSQL), Kerberos authentication, DNS, NIS, DHCP, backups via Tivoli Storage Manager, Samba, NFS, LDAP, and security-related services.

Other facilities available in the Computer and Information Science and Engineering (CISE) department are as follows:
• A disk storage system consisting of 2 Network Appliance filers, each with 1 terabyte of raw disk space.
• Approximately 35 servers providing services including web, email, database (Oracle and PostgreSQL), Kerberos authentication, DNS, NIS, DHCP, backups, Samba, NFS, dial-up, and security-related services.
• Five CPU servers publicly available to researchers for memory-intensive simulations, ranging from 2 GB to 8 GB of memory and from 300 MHz to 1 GHz UltraSPARC processors.
• Approximately 120 Sun workstations (Solaris), 175 PCs running Windows variants (primarily Windows 2000 and Windows XP), and 55 PCs running Linux (Debian), serving as lab machines and desktops for students, TAs, RAs, and faculty members. Of these, 24 Sun workstations, 52 Windows PCs, and 45 Linux PCs are in public labs available for general student use, computer lab use for classes, etc.
• A Cisco Catalyst 6513, which provides routing and switching capabilities to more than 600 machines and 80 networks in the department. The network runs over fiber optics throughout the department except for the machine room, which is connected to the Cisco with gigabit-rated UTP. All networks run at 100 Mb/s except for the servers, which use high-bandwidth connections running at 1 Gb/s.
• A wireless network that covers the entire building for faculty and students who have wireless notebooks and other devices.