Bangalore, India
JOB #R2101287

Software Architect, Data Platforms

ICF (NASDAQ:ICFI) is a global consulting services company with over 7,000 full- and part-time employees, but we are not your typical consultants. At ICF, business analysts and policy specialists work together with digital strategists, data scientists, and creatives. We combine unmatched industry expertise with cutting-edge engagement capabilities to help organizations solve their most complex challenges. Since 1969, public and private sector clients have worked with ICF to navigate change and shape the future.

ICF Next

ICF Next, Inc. (“ICF NEXT”) is a global marketing company. We provide marketing and communications capabilities to our customers. Over the years, we have built and integrated a set of best-in-class marketing and communications capabilities through different agencies and consultancies. ICF NEXT brings organizations closer to the people they serve. Our focus is on the insights, creativity, and technology that improve interactions with clients and motivate meaningful action.

With a passion for marketing and communication, ICF NEXT knows when and how to accelerate the adoption of technologies and techniques that bring you closer to your customer. As voice search, artificial intelligence, and virtual and augmented reality disrupt nearly every industry, we help organizations stay one step ahead by orchestrating the conversations and collaborations that produce innovation. With over 1,700 staff members and more than 15 global offices, we are a global strategic partner for engagement and transformation.

Job Location: Bangalore

Position Title: Senior Software Developer, Data Platforms

Position Summary:

The enterprise data platforms team requires a senior software developer to help evolve the Enterprise Data Lake, Operational Database, Data Quality/Testing/Validation, and related big data and Cloud initiatives. This individual will contribute to the success of both our product and data teams by being flexible and responsive to project requests while delivering high-quality solutions and services that support new demands for timely, integrated, and reliable data, all while integrating with and preserving critical legacy components of the architecture.

As a developer on this team, you will collaborate with cross-functional teams across the organization. You will be responsible for code and various data management activities that meet project and organizational requirements through collaboration with application developers, data and solution architects, DevOps engineers, scrum masters, product owners, QA, technical directors, and account managers. The successful candidate will be adept at working in a dynamic environment, know how to deal with ambiguity, and have a determined nature to learn, troubleshoot code, and complete tasks that involve different aspects of multiple large distributed database systems.

Performance Objectives:

  • Develop an understanding of the Tally platform and Operational/Analytics needs within 30 days
    • Understand the software stack as a whole, including the systems and data architecture, ETL processes and frameworks, and maintenance practices.
    • Understand application interactions with the database.
    • Understand the project’s scope in order to identify when requirements are out of scope and require a change order.
    • Understand ICF Next’s documented source control and branch management guidelines.
    • Demonstrate understanding of existing streaming/ingest pipelines and tenant implementation of Data Lake zones.
  • Help drive and maintain a high quality standard for data architecture and database code changes within Microsoft SQL Server, the Hadoop/Big Data ecosystem, Elasticsearch, and the Cloud
    • Perform all phases of data engineering including requirements analysis, application design, code development and testing.
    • Use previous experience to evolve the data platform and expand use cases within Hadoop services and the Enterprise Data Lake across multiple functional tenant needs.
    • Respect, implement, and manage strong data governance and security practices, especially with respect to the Data Lake.
    • Adhere to ICF Next’s security practices and perform regular security reviews throughout the project life cycle.
    • Estimate engineering work effort and effectively identify and prioritize the high impact tasks.
    • Troubleshoot production support issues and identify solutions as required to back up the team for Operational activities.
    • Ensure code is efficient and optimized for best performance.
    • Ensure that objects are modeled appropriately.
    • Review and test code changes in lower environments.
    • Lead and evangelize good testing and automation practices that foster a quality control mindset in new and existing team members.
    • Understand, manage, and troubleshoot jobs and monitoring software while contributing scripts to improve the predictability of system, database, and Hadoop cluster health.
  • Effectively work with the team and team workflow toolset to manage communication, status, issues and code quality.
    • Use Atlassian tools such as SourceTree/Bitbucket, JIRA, and Confluence.
    • Create JIRA tickets with enough information for the development team to estimate and resolve issues in a timely manner.
    • Competently use version control (Git) to manage topic branches and Pull Requests.
    • Follow ICF Next’s documented source control and branch management guidelines.
    • Review code and provide feedback relative to best practices and improving performance.
    • Follow build and automation practices to support continuous integration and improvement.
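As a rough sketch of the topic-branch workflow described above (the branch naming, ticket key, and local merge are illustrative assumptions, not ICF Next’s documented guidelines; in practice the merge would happen through a reviewed Bitbucket Pull Request):

```shell
# Hypothetical topic-branch flow in a throwaway repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b main
git config user.email dev@example.com
git config user.name "Dev"
echo "init" > README.md
git add README.md
git commit -qm "Initial commit"

# Create a topic branch keyed to a JIRA ticket (naming is an assumption).
git switch -qc feature/DATA-123-fix-ingest
echo "fix" >> README.md
git commit -qam "DATA-123: fix ingest job"

# Merge back to main; on a real project this step is a Pull Request + code review.
git switch -q main
git merge -q --no-ff feature/DATA-123-fix-ingest -m "Merge DATA-123"
git log --oneline
```

Keeping each ticket on its own short-lived branch and merging with `--no-ff` preserves a reviewable history, which fits the code-review and CI bullets above.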

Required Qualifications:

  • 7+ years of experience with SQL Server (T-SQL) versions 2016+ in medium-to-large database implementations, or a comparable RDBMS.
  • Demonstrated success migrating data between disparate systems.
  • Demonstrated ability to rapidly adopt the latest data engineering and testing automation technologies in support of the data architecture.
  • Ability to mentor and share knowledge effectively to help expand expertise throughout the organization.
  • Ability to lead and coordinate remote development teams.
  • Experience with at least some of the Big Data and NoSQL technologies, such as Hadoop/HDFS, Sqoop, Hive, Pig, Kafka, Storm, and Spark.
  • Knowledge of Hadoop file formats (e.g., Avro, Parquet, ORC) and their applicable use cases.
  • Champion the enforcement of data management and engineering best practices, with a keen focus on Data Lake organization and on managing disparate data sources for Intake, Integration/Aggregation, and Consumption of the Data Lake.
  • Strong commitment toward preservation of data lineage, quality and integrity.
  • Ability to drive positive development testing standards and practices within the team to include unit testing, integration testing, and performance testing as needed to ensure optimal data management, governance and delivery.
  • Understanding of OLTP, OLAP/Data Warehouse (star schema) and mixed workloads and when to best implement each as an architectural pattern.
  • High level of competency writing SQL queries and with relational database modeling and design.
  • A solid grasp of the Git version control system.
  • Ability to learn and expand the use of PowerShell to manage and monitor databases.
  • Ability to apply quality assurance principles to the data engineering and architecture disciplines.
  • Experience with Elasticsearch and integration to SQL Server or Hadoop ecosystem and components.
  • Basic knowledge of SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).

Bonus Qualifications:

  • Scaled Agile Framework (SAFe)
  • Cloud Data (especially AWS)
  • Python/Scala/Java, Spark
  • Analytics/Data Science
  • Additional SQL RDBMS (PostgreSQL, MySQL, …)
  • Additional document stores (MongoDB, …)

Working at ICF

Working at ICF means applying a passion for meaningful work with intellectual rigor to help solve the leading issues of our day. Smart, compassionate, innovative, and committed, ICF employees tackle unprecedented challenges to benefit people, businesses, and governments around the globe. We believe in collaboration, mutual respect, open communication, and opportunity for growth. If you’re seeking to make a difference in the world, find your next career with us. ICF—together for tomorrow.

Bangalore, India (II76)


Join our talent network

ICF is growing, and we add new open roles to our site regularly. If you're waiting for that perfect opportunity at ICF or want an inside look at what it's like to do world-changing work, join our talent network to stay updated.
