Locations: CT-Hartford, MA-Wellesley, NY-New York
Intelletec has partnered with one of the nation's premier health innovators to help people on their path to better health. They are building a new health care model that is easier to use, less expensive, and puts the consumer at the center of their care.
- Analyze science, engineering, business and other data processing problems to implement and improve computer systems.
- Perform advanced SQL and SAS programming in a data warehouse environment, including applying a range of analysis techniques and developing business intelligence applications.
- Architect, build, enhance, manage, and maintain the Medicare Stars Data Engine.
- Process large data sets through complex ETL pipelines into the Medicare Stars Data Engine.
- Explore, examine, and interpret large volumes of data in various forms.
- Perform analysis of structured and unstructured data to solve moderately complex business problems.
- Perform end-to-end testing of the Medicare Stars data warehouse and implement appropriate validation controls to ensure data quality.
- Develop automated data quality scripts to ensure data integrity and completeness.
- Design and conduct analyses and outcome studies using healthcare claims, pharmacy, and lab data, employing appropriate research designs and statistical methods.
- Develop, validate and execute algorithms, data models and reporting tools that answer applied research and business questions for internal and external clients.
- Define and deliver analytical solutions in support of healthcare business initiatives that drive short- and long-term objectives.
- Master’s degree in Computer Science, Computer Engineering, or Information Technology. A combination of degrees equivalent to a Master’s will be accepted.
- Minimum of three (3) years of experience working in Data Warehouse projects.
- Experience must include leveraging multiple tools and programming languages to analyze and manipulate data sets from disparate data sources; building data transformation and processing solutions; and working with large-scale search applications and building high-volume data pipelines.
- Must also have experience with the following: SAS Enterprise Guide, Advanced SAS, SAS Macro, SAS SQL, SAS/STAT, SecureFX, bash shell scripts, UNIX utilities and commands, IBM DB2, MS SQL, PuTTY, Hadoop architecture, Netezza, and Teradata. Mainframe: JCL, REXX, VSAM, XML, Easytrieve, MVS, z/OS, TSO, ISPF, File-AID, stored procedures, Endevor, ZEKE Scheduler, Xpediter, SyncSort, and IDCAMS.