AI-Native Software Engineer
Job description
Architect and Build Scalable Systems
- Design and implement high-quality backend systems in .NET and C#.
- Build modular, testable, maintainable architectures.
- Make sound trade-offs between performance, maintainability, and speed of delivery.
- Refactor legacy systems intelligently (not cosmetically).
Leverage AI Across the SDLC
- Use LLMs and code-generation tools to:
  - Draft and refactor production code
  - Generate unit/integration tests
  - Create migration scripts
  - Perform static analysis and code reviews
  - Produce technical documentation
- Build internal AI assistants for:
  - Log analysis
  - Support ticket triage
  - Codebase navigation
  - Technical debt discovery
- Continuously optimize development workflows using AI.
Own Architecture Decisions
- Design APIs (REST, event-driven, microservices where appropriate).
- Work with cloud-native patterns (Azure preferred).
- Integrate AI services into production systems.
- Evaluate when to build vs. buy vs. automate.
Raise the Bar
- Set standards for AI-assisted development.
- Mentor other engineers in AI-native workflows.
- Push for measurable productivity gains.
- Eliminate manual processes wherever possible.

TrialMaster, IRMS MAX, and TA Scan, the company's flagship products, reduce complexity in the drug and device discovery and commercialization process, allowing customers to enhance the quality of their patients' lives.
Requirements
We are hiring an AI-native software engineer who treats large language models, copilots, and agentic workflows as core parts of their development environment, not optional add-ons. You will design, build, and refactor production systems in .NET/C#, while using the latest AI tooling to dramatically increase speed, quality, and leverage. You are not just a developer. You are an architect, systems thinker, and automation builder.

Strong Experience With
- .NET
- C#
- ASP.NET Core / .NET Web APIs
Solid Understanding Of
- Entity Framework / ORM patterns
- SQL and database design
- Clean architecture principles
- Dependency injection
- Testing frameworks (xUnit, NUnit, etc.)
Cloud & DevOps
- Azure experience preferred
- CI/CD pipelines
- Infrastructure as Code familiarity
- Observability (logging, tracing, monitoring)
AI & Automation
Hands-on Experience With
- LLM APIs
- Code copilots and AI IDE tooling
- Prompt engineering for engineering workflows
Experience Building
- AI-assisted tools
- Internal bots or automation agents
Understanding Of
- Model limitations
- Cost-performance trade-offs
- Guardrails and reliability
What We're Looking For
Technical Depth
This is not just "comfortable with GitHub Copilot/Claude Code/Codex." You:
- Use AI daily to generate, refactor, test, document, and review code.
- Build internal AI agents and workflows to automate repetitive engineering tasks.
- Design systems assuming AI components will exist in the architecture.
- Know when AI is wrong - and can spot hallucinations instantly.
- Continuously experiment with new tools and replace old workflows aggressively.
- Read a 200k+ LOC codebase and map it quickly
- Identify architectural rot fast
- Make systems simpler, not more complex
Speed with Judgment
You move fast but you don't create messes others have to clean up.
AI Skeptic + Optimist
You:
- Aggressively use AI
- But never blindly trust it
- Know how to validate outputs
- Treat AI as a junior developer that needs supervision
Builder Mentality
You don't wait for perfect specs. You prototype, validate, iterate.
Nice to Have
- Experience modernizing legacy .NET Framework systems
- Experience integrating AI into enterprise workflows
- Background in regulated or compliance-heavy industries
- Frontend experience (React/Blazor) a plus but not mandatory
How We Measure Success
Within 6-12 months, you will:
- Increase engineering throughput measurably using AI-assisted workflows
- Reduce technical debt through intelligent refactoring
- Ship production-grade AI-enhanced features
- Help build an engineering culture where AI is standard, not experimental
- Deliver more output per engineer without compromising system integrity