Actively Hiring
Join Our Team at IExcel Tech
Technical Expertise - Hiring Verticals
At IExcel Tech, we hire top talent across diverse verticals, including IT, healthcare, finance, and administrative roles. Our specialized recruitment ensures you find the right professionals for any industry.
Java Back End Developer
Job Title: Senior Java Developer with Spring Boot (Hybrid Role)
Mode of Hire: W2, US Citizens / GC / EAD ONLY at Client in Irving, TX
Mode of Hire: C2C at Client in NYC, NY
Locations: Irving, TX | NYC, NY
Rate: $60 - $70 per Hour
Overview:
We are seeking a highly skilled Senior Java Developer with strong experience in Spring Boot to join our dynamic team. The successful candidate will be responsible for designing, developing, and maintaining high-performance, scalable Java applications using Spring Boot and other related technologies. This is a hybrid role, requiring the candidate to work onsite for 3 days per week.
Requirements:
- Minimum 5 years of hands-on experience in Java development and at least 3 years of experience with Spring Boot.
- Strong knowledge of Spring framework, including Spring Core, Spring MVC, Spring Data, and Spring Security.
- Proficient in RESTful web services and experience with API design and development.
- Experience with database technologies such as Oracle, MySQL, or PostgreSQL, and ORM frameworks like Hibernate or JPA.
- Proficient in Git, Maven, or Gradle for build and dependency management.
- Excellent problem-solving and debugging skills.
- Excellent written and verbal communication skills.
Preferred Qualifications:
- Experience with microservices architecture and containerization using Docker and Kubernetes.
- Knowledge of cloud platforms such as AWS, Azure, or Google Cloud.
Responsibilities:
- Design and develop high-quality Java applications using Spring Boot and related technologies.
- Collaborate with cross-functional teams to identify and solve complex business problems.
- Implement and maintain RESTful web services and APIs.
- Write clean, reusable, and maintainable code following best practices and industry standards.
- Participate in code reviews and contribute to the continuous improvement of the development process.
- Stay up-to-date with emerging trends and technologies in Java and Spring ecosystems.
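For candidates gauging the level expected, the REST responsibilities above correspond to work like the following minimal sketch. It uses only the JDK's built-in HTTP server; in the actual role a Spring Boot `@RestController` would replace the raw handler, and the `OrderApi` endpoint, path, and payload here are invented for illustration.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Minimal REST-style JSON endpoint using only the JDK's built-in HTTP server.
// A Spring Boot @RestController would replace the raw handler in practice;
// the endpoint, path, and payload are hypothetical.
public class OrderApi {
    static String fetchOrder(String id) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        // GET /orders/{id} -> a hard-coded JSON payload (a real service would query a DB).
        server.createContext("/orders", exchange -> {
            String orderId = exchange.getRequestURI().getPath().replaceFirst("/orders/?", "");
            byte[] body = ("{\"orderId\":\"" + orderId + "\",\"status\":\"SHIPPED\"}")
                    .getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
        });
        server.start();
        try {
            // Call the endpoint the way any REST client (or integration test) would.
            HttpResponse<String> resp = HttpClient.newHttpClient().send(
                    HttpRequest.newBuilder(URI.create(
                            "http://localhost:" + server.getAddress().getPort() + "/orders/" + id)).build(),
                    HttpResponse.BodyHandlers.ofString());
            return resp.body();
        } finally {
            server.stop(0);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(fetchOrder("42"));  // {"orderId":"42","status":"SHIPPED"}
    }
}
```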
Scala Developer With Java
Mode of Hire: C2C; US Citizens and GC holders preferred, but H-1B candidates are OK
F2F Interview: Candidates must come onsite for the interview
Client Location: NYC, NY
Rate: $60 - $70 per Hour
Overview:
We are seeking a skilled Scala Developer with strong experience in functional programming as well as Java. The candidate must be well versed in microservices. This is a hybrid role, requiring the candidate to work onsite for 3 days per week.
Requirements:
- Minimum 5 years of hands-on experience in Java development and Spring Boot, and at least 3 years of experience in Scala.
- Strong knowledge of Spring framework, including Spring Core, Spring MVC, Spring Data, and Spring Security.
- Proficient in RESTful web services and experience with API design and development.
- Experience with database technologies such as Oracle or MySQL, and with distributed caching.
- Exposure to Kafka and MQ messaging middleware.
- Experience in a Linux environment.
- Proficient with Git for version control and Maven for build and dependency management.
- Excellent problem-solving and debugging skills.
Skills Desired:
- Experience working in an agile team using Agile and DevOps practices and tools
- Experience working with Continuous Integration systems
- Experience with cloud-ready development, Docker containers
- Experience with Confluent Kafka
- Experience with Redis
- Experience with automation and scripting languages (Python preferred)
- Experience in Data Modeling.
Data Modeler/Analyst
Mode of Hire: W2, US Citizens / GC / EAD ONLY
F2F Interview: No
Client Location: Quincy, MA (Hybrid 2-3 Days/Week)
Rate: $60-70 USD per Hour (Without Benefits)
Duration: Long Term
Education: BS / MS in Computer Science Or Related
Overview:
Data Modeler with data analysis and Spark expertise, and proficiency in Spark/Scala, big data, Python, SQL, and cloud.
Requirements:
- Must have 5+ Years of data analyst experience.
- Candidate should be able to write SQL and NoSQL queries.
- Knowledge of Spark, Scala, and the Parquet file format is expected.
- Experience in the financial compliance domain is a big plus.
- Data Modeling Experience
- Strong communication and analytical skills are important to be successful in this role.
Skills Desired:
- Design and develop robust data models using tools such as Erwin Data Modeler or IBM Data Architect, aligning with business requirements and industry best practices.
- Conduct thorough data analysis using tools like SQL, Python, or R to identify patterns, trends, and insights, ensuring high data quality and integrity.
- Hands-on experience with Apache Spark, leveraging its capabilities for efficient data processing and analytics tasks. Proficiency in related tools like Databricks is a plus.
- Collaborate with cross-functional teams to understand data requirements and contribute to the development of scalable and efficient data solutions.
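As a rough illustration of the data-quality checks this role describes (the actual work would be in SQL, Python, or Spark, not plain Java), the profile below counts rows, missing values, and duplicate keys over an in-memory table; the customer_id/email columns are hypothetical.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Toy data-quality profile over in-memory rows, mirroring the kind of checks
// an analyst would express in SQL: COUNT(*), nulls per column, duplicate keys.
// Columns are hypothetical: row = {customer_id, email}; null models a missing value.
public class DataProfile {
    static Map<String, Long> profile(List<String[]> rows) {
        Map<String, Long> stats = new LinkedHashMap<>();
        stats.put("row_count", (long) rows.size());
        // Missing values in the email column.
        stats.put("null_emails", rows.stream().filter(r -> r[1] == null).count());
        // Duplicate keys: total rows minus distinct customer_ids.
        long distinctIds = rows.stream().map(r -> r[0]).distinct().count();
        stats.put("duplicate_ids", rows.size() - distinctIds);
        return stats;
    }

    public static void main(String[] args) {
        List<String[]> rows = List.of(
                new String[]{"c1", "a@x.com"},
                new String[]{"c2", null},
                new String[]{"c1", "a@x.com"});
        System.out.println(profile(rows));  // {row_count=3, null_emails=1, duplicate_ids=1}
    }
}
```

In SQL the same checks would be a `COUNT(*)`, a `COUNT(*) FILTER (WHERE email IS NULL)`, and a `GROUP BY customer_id HAVING COUNT(*) > 1`.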
Databricks Engineer
Mode of Hire: W2, US Citizens / GC / EAD ONLY
F2F Interview: No
Client Location: Quincy, MA (Hybrid 2-3 Days/Week)
Rate: $60-70 USD per Hour (Without Benefits)
Duration: Long Term
Education: BS / MS in Computer Science Or Related
Overview:
Candidate must be well versed in Databricks, AWS, PySpark, and Scala, with 5+ years of experience in the cloud.
Requirements:
- Recognize the current application infrastructure and suggest new concepts to improve performance.
- Document the best practices and strategies associated with application deployment and infrastructure support.
- Produce reusable, efficient, and scalable programs, as well as cost-effective migration strategies.
- Develop data engineering and ML pipelines in Databricks and various AWS services, including S3, EC2, API, RDS, Kinesis/Kafka, and Lambda, to build serverless applications.
- Work jointly with the IT team and other departments to migrate data engineering and ML applications to Databricks/AWS.
- Comfortable working on tight timelines when required.
Skills Desired:
- Solid programming background in Scala and Python.
- Solid understanding of Databricks fundamentals/architecture, with hands-on experience setting up Databricks clusters and working in Databricks modules (Data Engineering, ML, and SQL Warehouse).
- Knowledge of medallion architecture, DLT, and Unity Catalog within Databricks.
- Experience migrating data from on-prem Hadoop to Databricks/AWS.
- Understanding of core AWS services, their uses, and AWS architecture best practices.
- Hands-on experience in different domains, such as database architecture, business intelligence, machine learning, advanced analytics, and big data.
- Solid knowledge of Airflow.
- Solid knowledge of CI/CD pipelines using AWS technologies.
- Application migration of RDBMS, Java/Python applications, model code, Elastic, etc.
- Experience with Docker and Kubernetes is a plus.
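As a conceptual sketch of the medallion architecture this role mentions (in practice this would be Delta Live Tables or PySpark jobs on Databricks, not plain Java), the pipeline below keeps raw records in bronze, filters and normalizes them into silver, and aggregates into gold; the region/amount records are hypothetical.

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

// Conceptual medallion-architecture sketch: bronze keeps raw records as-is,
// silver filters/normalizes them, gold aggregates for serving. A real
// implementation would be Delta Live Tables / Spark jobs on Databricks.
public class Medallion {
    public static Map<String, Integer> run(List<String> bronze) {
        // Silver: drop malformed lines, keep well-formed "region,amount" records.
        List<String[]> silver = bronze.stream()
                .map(s -> s.split(","))
                .filter(f -> f.length == 2 && f[1].matches("\\d+"))
                .collect(Collectors.toList());
        // Gold: total amount per region, ready for BI/serving.
        return silver.stream().collect(Collectors.toMap(
                f -> f[0].trim().toUpperCase(),   // normalize the key
                f -> Integer.parseInt(f[1]),
                Integer::sum,                      // merge duplicates by summing
                TreeMap::new));
    }

    public static void main(String[] args) {
        List<String> bronze = List.of("east,10", "west,5", "east,7", "bad-record");
        System.out.println(run(bronze));  // {EAST=17, WEST=5}
    }
}
```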
Senior Data Engineer Lead
Mode of Hire: W2, US Citizens / GC / EAD ONLY
F2F Interview: No
Client Location: San Francisco, CA (Hybrid 2-3 Days/Week)
Rate: $65-70 USD per Hour (Without Benefits)
Overview:
Candidate must be well versed in Scala and Java, with experience developing applications using the Apache Spark framework.
Requirements:
- True hands-on developer in programming languages such as Java or Scala.
- Expertise in Apache Spark, database modeling, and working with a SQL or NoSQL database is a must.
- Working knowledge of a scripting language, such as Shell or Python.
- Experience working with Cloudera is preferred.
- Experience with orchestration tools like Airflow or Oozie would be a value addition.
- Knowledge of table formats like Delta or Iceberg is a plus.
- Working experience with version control (Git) and build tools (Maven) is recommended.
- Software development experience in addition to data engineering experience is good to have.
ETL Developer (Canada)
Mode of Hire: C2C
F2F Interview: Candidates must come onsite for the interview
Client Location: Montreal, Canada
Rate: $45 - $55 USD per Hour
Overview:
To support the complexity and diversity of our business, a strong governance, control, and risk management program is required. The Operations group is a core part of this effort, helping the Firm identify and track the effectiveness of key protocols. The Reconciliation team, partnering with the Global Operations group, focuses on providing smart, cost-effective, optimal, and holistic solutions to Operations groups across various business units. This enables them to perform checks and controls that ensure the accuracy of the Firm's Books and Records while mitigating risks to the Firm (settlement, operational, and systemic).
Requirements:
- Design, coding, testing, tuning, and maintenance of Informatica workflows
- Work directly with Modelers/Architects/DBAs on data modeling, logical/physical schema design, SQL query tuning, and the application of data-related business rules, while assisting in troubleshooting, debugging, and issue resolution
- Communicate effectively with end users, peers, and management regarding assignments, and must have a can-do attitude and a desire to provide solutions
- Delivering projects with a high-level of data quality and integrity
- Meet deadlines and possess a proven record of successful project execution and delivery.
- Demonstrate knowledge of SDLC processes and procedures adjusted for a Rapid Application Development methodology (agile)
- Provide support for the resolution of all reporting issues escalated by the Support Operations Team post-implementation.
- Work with the global development team to have designs implemented, tested, and productionized, and present them to user groups.
- Influence decisions with divisional partners, internal teams, senior management, and external partners.
- Collaborate with the team to define the information architecture, advocating interaction design best practices with a focus on consistency and usability
Skills Desired:
- 3-6 years of development experience in Informatica
- Extensive experience in designing and developing complex mappings, applying various transformations such as lookup, source qualifier, update strategy, router, sequence generator, aggregator, rank, stored procedure, filter, joiner, and sorter transformations
- Experience in integration of various data sources like DB2, SQL Server, and Flat Files into the staging area
- Expert in designing parallel jobs using various stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Modify, Aggregator, and XML parsing stages. Good experience with ETL tools: Informatica or SSIS/DataStage
- Good knowledge in identifying performance bottlenecks and also in tuning the Mappings and Sessions
- Expert knowledge of DB2, SQL Stored Procedures, Triggers, and data querying optimization
- Extensive experience working on UNIX platform, Perl scripting and managing Autosys jobs.
- Excellent communication, interpersonal, analytical, and leadership skills; self-motivated, a quick learner, and a team player. Able to work both in a team and individually, an independent thinker, and focused on meeting deliverables in a remote environment
- Ability to effectively interact with, present information to, and respond to questions from all levels of the organization and partners.
- Ability to communicate complex analysis in a clear, precise, and actionable manner
- Comfortable with ambiguity, creative thinking and leading change
- Financial Services experience will be a strong plus.
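For context on the lookup transformation the role emphasizes: a connected Informatica lookup is conceptually a cached hash join, caching the reference table in memory and enriching each source row by key. A minimal plain-Java sketch (the account-enrichment example is hypothetical, not Informatica itself):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// A lookup transformation sketched as a hash join: cache the reference table
// in a map, then enrich each source row by key, much as an Informatica
// connected lookup caches its lookup source. Example data is hypothetical.
public class LookupJoin {
    public static List<String> enrich(List<String[]> source, Map<String, String> lookup) {
        return source.stream()
                // row = {accountId, amount}; append the account name, falling
                // back to a default when the lookup misses (like a lookup
                // transformation's default value on no match).
                .map(row -> row[0] + "," + row[1] + "," + lookup.getOrDefault(row[0], "UNKNOWN"))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, String> accounts = Map.of("A1", "Cash", "A2", "Equity");
        List<String[]> trades = List.of(new String[]{"A1", "100"}, new String[]{"A9", "50"});
        System.out.println(enrich(trades, accounts));  // [A1,100,Cash, A9,50,UNKNOWN]
    }
}
```

The same join is what the DataStage Lookup stage and the Informatica lookup/joiner transformations perform at scale, with the cache spilling to disk when the reference data outgrows memory.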