The candidate will be responsible for requirements gathering and analysis, design, development, testing, and debugging of analytical, statistical/mathematical programming, optimization, and decision support systems. Additionally, the candidate will work with large amounts of data from diverse, complex, and frequently unrelated sources and be able to perform data analysis/mining, data cleansing, data integration, and occasional mathematical/data modeling with little or no guidance.
Interested candidates can reach me at (678) 819-8520 or by email at xavierk@askstaffing.com
1) Requirements gathering and analysis; architect solutions and implement them using Agile methodology.
2) Data acquisition, ETL, data cleansing, data mining, data integration, and some statistical analysis and programming.
3) Implement data models and statistical/mathematical, linear-programming, or optimization models.
4) Oracle database/table design, database development, data-structure indexing, database partitioning, database performance monitoring and tuning; database backup and replication.
5) Analyze, design, program, debug, and modify software enhancements and/or new products used in local, networked, or Internet-related computer programs.
6) Set up and administer document/file transfer and repository systems such as FTP, SSH, SharePoint drives, or other similar technologies.
7) Write code and complete programming and documentation using the appropriate programming language and technology in a Unix/Linux and/or web environment. Perform testing and debugging of applications.
8) Web applications with multiple levels/dimensions of security, inside/outside firewalls, eCommerce-like with encryption or equivalent technologies.
Requirements:
- Bachelor's or Master's degree in computer science, engineering, or statistics
- At least one year of experience with large-scale data analysis in the Hadoop ecosystem
- Good programming skills
- Experience in at least one programming language (Python or Java preferred); 1-2 years
- Experience in R, Pig, Hive, and SQL is required; 1-2 years
- Must have experience in Unix and command-line scripting; 1-2 years
- Experience with Spotfire, Tableau, or visualization libraries in R desired
- Fundamental understanding of statistics and linear algebra concepts
- Fundamental understanding of building predictive models through machine learning
- Good problem-solving skills
- Ability to explain the analysis and output
- Comfortable in a team environment
- Driven to acquire new knowledge

Responsibilities:
- Develop a good understanding of internal systems and data sources
- Perform data analysis to support various business needs
- Aggregate data sources to build enablers that aid predictive model building
- Create time-sensitive aggregated reports for consumption by various business users
- Automate report generation and work with partner teams to maintain reporting quality
- Work with partner teams to create visualizations of output tailored to business users
- Develop expertise in and take ownership of the data enablers created

Local candidates preferred. Not customer facing. Flexible assignment hours (7-4, 8-5, or 9-6).

Must haves: all four of Big Data – Hive, Big Data – Pig, Sqoop, Python or Java, Unix, Shell scripting. Nice-to-have skills: SQL, AWK. All other skills are neither required nor nice to have.