Mayur Pal // Dev_OS|Status: Live_Ready
Loc: Jalandhar, IN | v1.0.4

Status: Available for Q3 Projects // Specialized in SaaS & Scalable Architecture

Fullstack Power.
DevOps Precision.

I’m Mayur Pal. I build high-performance applications using Go, Next.js, and Postgres.

"I architect resilient digital infrastructure. From rapid MVP development to scaling distributed systems, I ensure your product stays up when your traffic spikes."

START A PROJECT
Resume.pdf
root@thissidemayur:~
Golang / TS
Docker / AWS
Postgres / Redis
CI/CD Actions
Next.js 16
Linux / Bash

Production Stack

Engineered for Scale

Tech_Stack_Engine

Verifying_Production_Dependencies...

Live Dashboard

Portfolio Statistics

Data cached for 15m
Total Projects
05
Blog Posts
05
Certifications
06
Received Messages
13
Available Now
Active Status

Career Status

Open to
Opportunities
Freelance • Internship • Full-time
Avg. Response Time
Under 24h
Away
Based in India

Operating From

Jalandhar,
Punjab
University: LPU, CSE '26
Local Time
--:--
Mayur Pal

Architecting resilient systems and high-performance digital experiences with a focus on cloud-native infrastructure.

Sitemap

  • Projects
  • Blogs
  • About

Connect

© 2026 All Rights Reserved | Status: Operational
Next.js 16 • AWS_MUM_Node • v2.0.0
Home
About
Projects
Certs
Blog
Resume
Contact
READY
Work With Me

ENGINEERING Blogs

Documentation of architectural decisions and system design patterns.

TECHNICAL

I Built a Terminal AI Chat App Using Docker — No API Keys, No Cloud

In the final part of this series, I build a complete terminal AI chat app in TypeScript using Docker Model Runner. One binary called llm that anyone can run — streaming responses, conversation memory, preset modes, auto model detection, and history saved to disk. I walk through the full architecture, every design decision, and what I learned. Plus — why I'm migrating this to Go next.

8 MIN_READ
—Mar 2026
OPEN_LOG
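The conversation-memory and history-on-disk pieces described above could be sketched roughly like this. This is my illustration, not the post's actual code: the history file name and message shape are assumptions.

```typescript
// Sketch: keep the full chat history in an array and persist it as
// JSON so the next run of the CLI reloads the conversation.
import * as fs from "node:fs";

type Msg = { role: "system" | "user" | "assistant"; content: string };

const HISTORY_FILE = ".llm-history.json"; // hypothetical location

// Load prior conversation, or start fresh on the first run.
function loadHistory(): Msg[] {
  try {
    return JSON.parse(fs.readFileSync(HISTORY_FILE, "utf8"));
  } catch {
    return [];
  }
}

// Append one user/assistant exchange and flush it to disk.
function saveTurn(history: Msg[], question: string, answer: string): Msg[] {
  const next: Msg[] = [
    ...history,
    { role: "user", content: question },
    { role: "assistant", content: answer },
  ];
  fs.writeFileSync(HISTORY_FILE, JSON.stringify(next, null, 2));
  return next;
}
```

Persisting the full message array (rather than just the last reply) is what lets a stateless chat-completions API behave like a conversation with memory.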
TECHNICAL

Talking to Your Local AI Through Code — DMR REST API + TypeScript

In Part 1 we set up Docker Model Runner. Now we actually use it. In this post I walk through calling your local LLM programmatically using TypeScript and the OpenAI SDK — no API key, no cloud. We build three scripts from scratch: a single-question chat, a streaming response that feels like ChatGPT, and a multi-turn conversation with memory. Plus — DMR's native endpoints for managing models directly from your code.

8 MIN_READ
—Mar 2026
OPEN_LOG
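A minimal sketch of the no-API-key pattern the post describes. The post uses the OpenAI SDK; this version uses Node's built-in fetch instead, and the endpoint URL and model tag are assumptions — substitute whatever `docker model ls` shows locally.

```typescript
// Talking to a local LLM through an OpenAI-compatible REST API,
// as Docker Model Runner exposes one. No API key required.
type Msg = { role: "system" | "user" | "assistant"; content: string };

// Pure helper: build the chat-completions request body.
function buildRequest(model: string, history: Msg[], question: string) {
  return {
    model,
    messages: [...history, { role: "user" as const, content: question }],
  };
}

// Send the request with Node 18+'s built-in fetch.
async function ask(history: Msg[], question: string): Promise<string> {
  const res = await fetch(
    "http://localhost:12434/engines/v1/chat/completions", // assumed DMR port
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(buildRequest("ai/llama3.2", history, question)),
    },
  );
  const data: any = await res.json();
  return data.choices[0].message.content;
}
```

Because the wire format matches OpenAI's chat-completions API, swapping between a local model and a hosted one is just a change of base URL.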
TECHNICAL

Run AI Locally for Free — No API Keys, No Cloud, No Limits

As a CS student, I wanted to experiment with LLMs without paying for API access or sending my data to the cloud. Turns out, if you already have Docker installed, you're 90% there. This is Part 1 of a series where I set up Docker Model Runner, explain the GenAI vocabulary that confuses everyone (parameters, quantization, context window), and get a 3B model running locally on my laptop — for free.

8 MIN_READ
—Mar 2026
OPEN_LOG
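The vocabulary the post covers (parameters, quantization) supports a quick back-of-envelope memory estimate. This sketch is my illustration, not from the post, and it counts weights only — runtime overhead such as the KV cache is ignored.

```typescript
// Rough model memory footprint:
//   (parameter count) x (bytes per parameter),
// where quantization shrinks the bytes per parameter.
function estimateGiB(paramsBillions: number, bitsPerParam: number): number {
  const bytes = paramsBillions * 1e9 * (bitsPerParam / 8);
  return bytes / 2 ** 30;
}

// estimateGiB(3, 4) ≈ 1.4 GiB (weights only) -- why a 4-bit-quantized
// 3B model fits comfortably in a laptop's RAM.
```

The same arithmetic shows why quantization matters: the identical 3B model at 16-bit weights needs roughly four times the memory.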
EXPLORE ALL Blogs

Certifications

01 / 06
Practise Go
CodeChef

Practise Go

Issued: May 2025
VERIFY_CREDENTIAL
System_Status: Open_for_Collaboration

Ready to scale
your_vision.

Currently accepting high-impact projects and full-stack engineering roles. Let's talk infrastructure.

Drop_an_Email
Book_a_Call