Friday, February 27, 2026

Database Applications Architect

Work Timings: 6 PM IST – 3 AM IST [Semi-Night Shifts]

Summary
We're seeking a Database Applications Architect to join our development team and drive performance, scalability, and reliability across our messaging platform serving utility companies. Our platform delivers millions of critical messages in minutes, requiring sophisticated database architecture and optimization. You'll work across our diverse technology stack with engineering leads to identify bottlenecks, improve DB application components, and implement solutions that make our systems faster and more efficient. This is a database-focused application architecture role with real influence over system design and engineering practices.

Key Responsibilities

Architectural Leadership
• Analyze and optimize database interactions across our multi-database environment (Oracle, MySQL, MongoDB, DynamoDB, MSSQL Server)
• Design and implement database schemas, data models, and access patterns that support high-throughput messaging operations
• Re-architect portions of applications to eliminate performance bottlenecks and improve scalability
• Evaluate and recommend database technologies and tools to meet evolving business requirements
• Act as a technical advisor to engineering leads on data architecture decisions

Performance Optimization
• Identify and resolve database performance issues in production applications
• Optimize queries, indexes, and database configurations to support millions of messages per minute
• Design caching strategies, connection pooling, and data access patterns for high-concurrency scenarios
• Conduct performance testing and capacity planning to ensure system reliability during peak loads

Development Collaboration
• Work closely with software engineers to design efficient data access layers
• Review application code and provide guidance on database best practices
• Implement monitoring, alerting, and diagnostic tools to proactively identify issues
• Establish standards and processes for database development across teams

Cross-Stack Problem Solving
• Understand the full application stack (database, queuing architecture, frontend) to optimize end-to-end performance
• Analyze how database design impacts message queuing, delivery, and overall system throughput
• Bridge the gap between application development and operational requirements

Qualifications

Required Experience
• 15+ years of hands-on experience with database application development
• Deep expertise in at least 2–3 database platforms from our stack (Oracle, MySQL, MongoDB, DynamoDB, MSSQL Server)
• Proven track record optimizing high-volume, high-throughput database applications
• Strong SQL skills and experience with query optimization and performance tuning
• Experience with both relational and NoSQL database architectures
• Familiarity with cloud database services (AWS RDS, DynamoDB, etc.) and on-premises deployments

Technical Skills
• Expert-level understanding of database indexing, partitioning, and sharding strategies
• Experience designing data models for scalability and performance
• Knowledge of connection pooling, caching layers, and data access optimization
• Proficiency with database monitoring and profiling tools
• Understanding of distributed systems concepts and data consistency patterns
• Familiarity with message queuing systems (RabbitMQ, Kafka, SQS, or similar)

Preferred Qualifications
• Experience with messaging or notification platforms
• Knowledge of utilities industry requirements and regulatory compliance (optional)
• Background in performance testing and capacity planning
• Exposure to database administration tasks (configuration, deployment, backup/recovery)
• Experience with CI/CD pipelines and database migration tools
• Understanding of infrastructure-as-code and configuration management

Soft Skills
• Strong analytical and problem-solving abilities
• Excellent communication skills, with the ability to explain complex technical concepts to diverse audiences
• Collaborative mindset with experience working across development teams
• Pragmatic approach to balancing architectural ideals with business realities
• Self-motivated, with the ability to work independently and drive initiatives

Our Environment

Technology Stack
• Databases: Oracle, MySQL, MongoDB, DynamoDB, MSSQL Server (cloud and on-premise)
• Message queuing architecture supporting multi-channel delivery (SMS, email, push, etc.)
• Mix of cloud-hosted and on-premise infrastructure
• Modern development tools and monitoring solutions

What Makes This Role Unique
• High-impact work: your optimizations directly affect millions of utility customers receiving critical communications
• Technical diversity: work with multiple database platforms and architectural patterns
• Real challenges: solve complex problems at scale where performance matters
• Strategic influence: shape the technical direction of our platform

About Company
This role is with Convey, based in Denver, Colorado, USA (https://www.goconvey.com/). The hired engineer will be on the payroll of AllDomainSoftware Pvt. Ltd., working full time for Convey from home.
Website: http://alldomainsoft.com
Address: Unit No. 1032, 10th Floor, Tower B3, Spaze I-Tech Park, Sec 49, Sohna Road, Gurgaon, Haryana, India.
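As a point of reference for the connection-pooling and high-concurrency data-access work this role describes, the pattern can be sketched in a few lines. This is a minimal stdlib-only illustration, not the platform's actual implementation: `sqlite3` stands in for the databases listed above, the `ConnectionPool` class and pool size are invented, and a production system would use a mature pooling library instead.

```python
import queue
import sqlite3

class ConnectionPool:
    """Minimal connection pool sketch: hand out pre-opened connections
    instead of paying the cost of opening one per request."""

    def __init__(self, size, factory):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(factory())

    def acquire(self, timeout=5.0):
        # Blocks until a connection is free, bounding concurrency.
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        # Return the connection for reuse rather than closing it.
        self._pool.put(conn)

# Usage: sqlite3 in-memory databases stand in for real DB servers.
pool = ConnectionPool(4, lambda: sqlite3.connect(":memory:",
                                                 check_same_thread=False))
conn = pool.acquire()
result = conn.execute("SELECT 1").fetchone()[0]
pool.release(conn)
```

The point of the pattern is that connection setup cost is paid once at startup and the pool size caps how many concurrent sessions the database must serve.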

Tuesday, February 24, 2026

phData

About phData
phData works exclusively in the realm of data and machine learning. By partnering with the biggest cloud data platforms (Snowflake, Azure, AWS), we're able to help the world's largest companies solve their toughest challenges. Our work is challenging and our standards are high, but we invest heavily in our employees, starting with a 2–4-week bootcamp to ensure your success. Plus, you'll get to work with the brightest minds in the industry and the best-in-class platforms on the market. And, because the data and ML industry is changing rapidly, you will always have the opportunity to learn, whether that's a new technology, diving deeper into your preferred stack, or picking up an entirely new skill set.

Even though we're growing extremely fast, we maintain a casual, exciting work environment. We hire top performers and give you the autonomy to deliver results. Our award-winning workplace fosters learning, creativity, teamwork, and diversity.
• 5x Snowflake Partner of the Year (2020, 2021, 2022, 2023, 2024)
• #1 Partner in Snowflake Advanced Certifications
• 600+ Expert Cloud Certifications (Fivetran, dbt, Sigma award winners)
• 7x Best Places to Work
• Inc. 5000 Fastest-Growing US Companies (2020–2023)

Fueled by talented data scientists, engineers, architects, and innovative thought leaders, phData is shaping the future of data. We're looking for Solutions Architects to join our Elastic Operations team to serve as hands-on technical leaders who help customers operate and manage their next-generation cloud data platforms.

Overview
We are seeking qualified Solutions Architects to help deliver our Elastic Operations service from our Managed Services team in Bangalore, India, as we continue our rapid growth with an expansion of our Indian subsidiary, phData Solutions Private Limited. This expansion comes at the right time, with increasing customer demand for data and platform solutions. This is a senior-level, hands-on technical subject-matter-expert position; only experienced candidates with a deep passion for understanding and designing complex data solutions should apply.

As a Solutions Architect on our Managed Services team, you will be responsible for delivering large-scale, complex data platform and infrastructure projects running on Snowflake and other native cloud platform services in AWS and Azure. In addition, you will need the ability to learn and quickly upskill on data ecosystem technologies related to data ingestion, data migration, platform design, and architecture, with some exposure to data visualization.

Responsibilities
• Propose, design, and provision cloud-native data solutions on AWS/Azure, leveraging a deep understanding of Snowflake, IAM, S3, EC2, Kinesis, SageMaker, Airflow, Kafka, Azure Data Factory, ADLS, Fivetran, dbt, and/or other services and tools in designing and enhancing these solutions.
• Lead a technical team operating and managing modern data platforms, from streaming to data lakes to analytics and beyond, across a progressively evolving technical stack.
• Take clear ownership of multiple simultaneous customer accounts across a variety of technical stacks as the technical leader.
• Skillfully navigate complex customer environments and build Epics, Stories, and Tasks to mature and improve our customers' data platforms; delegate to and coach Engineers and ensure the successful and timely delivery of these Epics.
• Provide thought leadership by recommending the right technologies and approaches to maturing and solving problems, helping ensure performance, security, scalability, and user satisfaction.
• Lead discussions and contribute to design recommendations for data model design for data warehouses and data lakes.
• Continually hunt for ways to automate, optimize, and expand our customers' data platforms and service offering.

Required Experience
• Production support of enterprise data platforms such as Snowflake, AWS, Azure, or Databricks.
• Extensive experience providing architectural guidance and operational support across a large user base for a cloud-native data warehouse (Snowflake and/or Redshift), with a background in systems engineering, DevOps, DBA, or data architecture.
• Production experience with cloud and distributed data storage technologies such as S3, ADLS, HDFS, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems.
• Production experience with data integration technologies such as Spark, Kafka, event/streaming, Fivetran, HVR, NiFi, AWS Database Migration Service, or Azure Data Factory.
• Production experience with workflow management and orchestration tools such as Airflow, AWS Managed Airflow, Luigi, or NiFi.
• Strong working knowledge of SQL and the ability to write, debug, and optimize SQL queries.
• Deep expertise with cloud-native data technologies in AWS or Azure.
• Professional track record of creating, challenging, and improving processes and procedures.
• Unmatched troubleshooting and performance-tuning skills.
• Proven experience as a technical team lead, including mentorship and up-skilling/training of engineers.
• Passion for learning new technology stacks.
• Ability to create and deliver detailed technical presentations for an executive audience.
• Excellent client-facing written and verbal communication skills and experience.

Preferred Experience
• Extensive experience with infrastructure as code using Terraform or CloudFormation.
• Proven expertise using Python to build automated solutions for repetitive support tasks.
• Well-versed in continuous integration and deployment frameworks, with hands-on experience using CI/CD tools such as Bitbucket, GitHub, Flyway, and Liquibase.
• Snowflake SnowPro Core certification.
• Bachelor's degree in Computer Science or a related field.

This is an ideal opportunity if you enjoy…
• Seeing the big picture. You recognise that data challenges are multifaceted, and you understand the technical, business, operational, and human issues at play.
• Fast-paced operations work. You get energized by working across multiple projects and customers at once as requirements and situations evolve.
• Solving complex problems. You're eager to take on big issues that have stumped some of the world's biggest companies.
• Venturing boldly into uncharted territory. When facing a problem you haven't seen before, you enjoy the challenge of navigating rocky terrain to build an ideal solution.
• Working across technology stacks and organizational silos. You've seen what happens when people get stuck in one platform, application, or way of thinking. You can integrate disparate functions and technologies to make your solutions work.
• Thinking on your feet. You don't get rattled by the unexpected. You quickly assess the situation and formulate a plan of action.
• Following up and following through. Not only can you envision the solution, but you can break it into actionable steps and keep pushing toward the final outcome, no matter what gets in the way.
• Managing the human side. You've got the presence and poise to work confidently and respectfully with your teammates, your customers, and more, even when it's time for difficult conversations.
• Collaborating with others and being yourself. You value an environment where it's safe to ask questions, take calculated risks, and be authentic. You support your teammates and expect others to do the same.

Perks and Benefits
• Medical insurance for self & family
• Medical insurance for parents
• Term life & personal accident cover
• Wellness allowance
• Broadband reimbursement
• Professional development allowance
• Reimbursement of skill-upgrade certifications
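Among the required skills above is the ability to write, debug, and optimize SQL queries. One classic optimization step is checking whether a query plan uses an index or falls back to a full table scan. The sketch below illustrates this with Python's built-in `sqlite3` as a stand-in for a warehouse engine (the `events` table, its columns, and the data volume are invented for illustration; Snowflake and Redshift expose the same idea through their own `EXPLAIN` facilities).

```python
import sqlite3

# Toy dataset: 1,000 rows spread over 100 user_ids.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, f"2026-01-{i % 28 + 1:02d}", "x") for i in range(1000)],
)

query = "SELECT * FROM events WHERE user_id = 7"

# Without an index, the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

# After adding an index on the filter column, it can seek directly.
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

print(plan_before)  # a table scan
print(plan_after)   # an index search via idx_events_user
```

The same diagnostic loop (run the plan, spot the scan, add or fix the index, re-check) carries over to the cloud warehouses named in this posting, even though the plan syntax differs per engine.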

Wednesday, February 18, 2026

Meera.ai Final

Company Link: Meera.ai
Position: Data Administrator / Database Administrator (DBA)
Location: Remote
Reports to: CTO

Job Summary
The Data Administrator will own the foundation of Meera.ai's data ecosystem, ensuring that platform, client, and operational data are accurate, consistent, secure, and optimized from ingestion through reporting. This role is critical to maintaining high data integrity across AWS, Redshift, and internal systems, enabling fast, reliable insights in Tableau and other analytics tools. The ideal candidate will combine deep technical expertise in database and data warehouse management with a strong understanding of data governance, compliance, and performance optimization. This position will be instrumental in defining and enforcing best practices for data modeling, ETL/ELT architecture, and BI enablement across the organization.

Key Responsibilities

1. Data Infrastructure Design & Management
• Design, implement, and manage AWS-based data infrastructure, including Redshift, RDS, and S3, for scalable, high-performance storage and querying.
• Develop and maintain data pipelines for ingesting, transforming, and storing large-scale platform data (usage, messaging, campaign, and customer metrics).
• Implement and maintain database objects (schemas, tables, views, indexes) optimized for analytical workloads.

2. Data Governance, Security & Compliance
• Establish and enforce data governance policies, ensuring adherence to Meera.ai's compliance requirements (TCPA, FERPA, SOC 2, and HIPAA where applicable).
• Manage data access, permissions, encryption, and auditing across AWS and Redshift environments.
• Maintain data dictionaries, lineage documentation, and metadata standards to ensure transparency and reliability across teams.
• Implement row-level and column-level security for sensitive datasets (e.g., client PII, SMS logs, and compliance reporting).

3. Performance Optimization & Reliability
• Monitor and tune Redshift performance, managing WLM queues, vacuuming, sort keys, and distribution strategies.
• Optimize ETL jobs for performance and cost efficiency across AWS services (Glue, Lambda, Step Functions).
• Manage query optimization, resource utilization, and scaling configurations.
• Conduct regular system health checks, capacity planning, and cost monitoring.

4. Data Quality, Integrity & Validation
• Define and automate data validation and cleansing routines across ingestion points to ensure accuracy and consistency.
• Oversee master data management (MDM) processes for client accounts, campaigns, and lead data.
• Identify and resolve data discrepancies between source systems (e.g., Twilio logs, CRM, billing, and platform usage data).

5. Backup, Recovery & Security
• Develop and manage automated backup, restore, and disaster recovery strategies across AWS environments.
• Ensure redundancy and data durability through appropriate S3 versioning, snapshots, and archival policies.
• Conduct regular security audits and penetration tests for data systems.

6. BI & Reporting Enablement
• Serve as the primary liaison between the data warehouse and the BI team, enabling Tableau dashboards that track platform health, campaign performance, and financial insights.
• Design and maintain semantic models, views, and data marts for self-service analytics.
• Support business units in defining KPIs, data definitions, and custom report logic.
• Automate Tableau extract refreshes and performance optimizations.

7. Collaboration & Process Standardization
• Work closely with Engineering, Product, Finance, and Customer Success teams to align on data strategy, governance, and platform metrics.
• Create internal documentation, SOPs, and version control processes for database changes.
• Drive best practices in data lifecycle management, cost control, and system scalability.
• Mentor team members on querying best practices, data visualization standards, and efficient resource use.

Qualifications
• Bachelor's degree in Computer Science, Data Engineering, or a related field (Master's preferred).
• 5+ years of experience in database administration, data engineering, or data warehouse management.
• Expertise in Amazon Redshift, AWS RDS, S3, and related AWS data services (Glue, Lambda, CloudWatch, IAM).
• Strong SQL and data modeling skills (star schema, snowflake schema, normalization).
• Hands-on experience with ETL/ELT orchestration (e.g., Airflow, dbt, Step Functions).
• Experience integrating data from multiple sources, including CRMs, marketing automation, and Twilio messaging data.
• Proficiency in Tableau (data sources, extracts, performance tuning, user permissions).
• Strong understanding of data security, privacy, and compliance frameworks (TCPA, FERPA, HIPAA, SOC 2).
• Excellent analytical and communication skills, with the ability to translate technical details into business insights.

Preferred Skills (Nice to Have)
• Experience with Python for ETL scripting or automation.
• Familiarity with data catalog and observability tools (e.g., AWS Glue Data Catalog, Monte Carlo, DataHub).
• Understanding of ASC 606 and revenue-reporting data structures for SaaS metrics.
• Experience building usage-based billing dashboards and data quality monitors.
• Knowledge of machine learning data preparation or predictive analytics models.

Impact of the Role
The Data Administrator ensures that Meera.ai's data, from inbound leads and message logs to campaign performance and billing, is accurate, secure, and actionable. This role provides the backbone for executive dashboards, customer analytics, and compliance visibility, directly influencing product decisions, financial forecasting, and customer success outcomes.
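The data quality responsibilities above call for automated validation and cleansing routines at ingestion points. The shape of such a routine can be sketched in a few lines of stdlib Python. Note that the field names (`account_id`, `campaign_id`), the rules, and the `validate_rows` helper are all invented for illustration; in practice these checks would typically live in dbt tests, Glue jobs, or a similar pipeline stage.

```python
def validate_rows(rows, required=("account_id", "campaign_id")):
    """Flag rows with missing required fields or duplicate
    (account_id, campaign_id) keys; return (row_index, reason) pairs."""
    seen, errors = set(), []
    for i, row in enumerate(rows):
        missing = [f for f in required if not row.get(f)]
        if missing:
            errors.append((i, "missing: " + ", ".join(missing)))
            continue
        key = tuple(row[f] for f in required)
        if key in seen:
            errors.append((i, f"duplicate key: {key}"))
        seen.add(key)
    return errors

# Usage on a toy batch: one duplicate, one row missing a field.
rows = [
    {"account_id": "A1", "campaign_id": "C1"},
    {"account_id": "A1", "campaign_id": "C1"},   # duplicate key
    {"account_id": None, "campaign_id": "C2"},   # missing account_id
]
errors = validate_rows(rows)
```

Running checks like these at every ingestion point, and routing the flagged rows to a quarantine table rather than silently dropping them, is what keeps discrepancies between source systems visible and fixable.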