Paweł Oljasz

About Me

Engineering Leader | Infrastructure & Security | AI Systems

Technology executive and hands-on engineer with over 15 years of experience in infrastructure, cybersecurity, and software delivery. I lead a team of highly qualified specialists across DevOps, security, and platform engineering - people I trust to own complex problems end to end.

My background spans enterprise-scale platform engineering, security architecture, and more recently AI infrastructure. I've designed and operated systems handling millions of transactions, built security programs from scratch, and shipped products across regulated industries. I stay technical because leaders who stop building stop understanding what their teams actually need.

Areas of Focus

Infrastructure & Platform Engineering

  • Distributed computing - bare metal clusters, cloud-native architectures, hybrid environments
  • Container orchestration and microservices at scale
  • Observability platforms - designing monitoring that engineers actually use
  • CI/CD pipelines, GitOps workflows, Infrastructure as Code

AI & GPU Computing

  • NVIDIA GPU infrastructure - CUDA, multi-node inference, power-aware scheduling
  • LLM deployment and optimization across heterogeneous hardware
  • AI safety engineering - prompt injection detection, content classification, audit trails
  • Distributed inference across Apple Silicon and NVIDIA platforms

Cybersecurity

  • Security architecture and threat modelling
  • Network security, DNS monitoring, and intrusion detection
  • Application security - from code review to production hardening
  • Compliance frameworks and security program development

Engineering Leadership

  • Building and growing high-performing engineering teams
  • Technical strategy - aligning architecture decisions with business outcomes
  • Cross-functional collaboration between engineering, product, and security

The Hardware Side

Outside of enterprise software and cloud infrastructure, I have a deep personal interest in hardware, embedded systems, and radio. It isn't something that shows up on a CV or a LinkedIn profile, but it's a big part of how I think about technology, and it directly shapes how I approach system design at every level.

I've been working with Arduino boards since the early days of the platform. What started with blinking LEDs turned into building environmental sensor networks, custom motor controllers, and automated irrigation systems. Arduino taught me how to think about constraints - when you have 2KB of RAM and a 16MHz clock, every byte and every cycle matters. That discipline carries over into how I write backend services and think about resource allocation in production systems.
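That byte-counting habit is easiest to show with a toy example (illustrative only, not code from any of my projects): storing a sensor reading as a packed 6-byte binary record instead of a text line, the kind of trade-off that matters when the whole board has 2KB of RAM.

```python
import struct

# Toy illustration: a fixed-size binary record for one sensor reading.
# Little-endian, no padding: u16 sensor id, s16 temperature in 0.1 °C,
# u16 humidity in 0.1 %RH -> exactly 6 bytes per reading, vs ~20+ bytes
# for the same data as a comma-separated text line.
RECORD = struct.Struct("<HhH")

def pack_reading(sensor_id: int, temp_c: float, humidity: float) -> bytes:
    """Encode one reading into a fixed 6-byte record."""
    return RECORD.pack(sensor_id, round(temp_c * 10), round(humidity * 10))

def unpack_reading(buf: bytes) -> tuple[int, float, float]:
    """Decode a 6-byte record back into (id, °C, %RH)."""
    sensor_id, temp, hum = RECORD.unpack(buf)
    return sensor_id, temp / 10, hum / 10

raw = pack_reading(3, 21.4, 55.2)
assert len(raw) == 6
assert unpack_reading(raw) == (3, 21.4, 55.2)
```

Fixed-point tenths instead of floats is the same discipline in miniature: you give up precision you don't need to get a record size you can budget for.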

Raspberry Pi opened up a different world. I run several of them for home automation - controlling heating zones, monitoring power consumption, managing network infrastructure. The FutureGrid edge devices are built on embedded Linux boards. Getting a reliable, secure, self-updating system running on a single-board computer taught me more about Linux internals than any server-side work ever did. When you're debugging a kernel panic on a headless device sitting in someone's electrical cabinet, you learn to write software that doesn't crash.

FPGAs are where it gets really interesting. I've been experimenting with Xilinx and Lattice boards for signal processing and custom protocol implementations. The mental model for FPGA development is completely different from software - you're designing hardware in code, thinking about clock domains, timing constraints, and parallel execution at the gate level. I've used FPGAs for high-speed data acquisition and custom communication interfaces where a microcontroller would be too slow and an ASIC too expensive.

Software Defined Radio (SDR) is another rabbit hole I fell into a few years ago. I started with an RTL-SDR dongle, listening to aircraft transponders and weather satellites, then moved on to HackRF and LimeSDR for wider-bandwidth work. I've built custom decoders for LoRa packets, analyzed IoT device transmissions for security research, and experimented with building my own LoRaWAN gateways from scratch. Understanding RF and modulation at a low level changed how I think about wireless security - most IoT protocols are far more vulnerable than people realize, and you can't defend what you don't understand.

LoRaWAN specifically ties into the FutureGrid work. Long-range, low-power communication is perfect for energy monitoring in locations where WiFi doesn't reach - underground cable vaults, remote transformer stations, distributed solar installations across agricultural land. I've deployed custom LoRa sensor nodes that run for months on a single battery, reporting temperature, humidity, and current measurements back to a central gateway.
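The "months on a single battery" claim comes down to duty-cycle arithmetic. Here's a back-of-the-envelope sketch with made-up numbers (not measurements from the deployed nodes): the node sleeps almost all the time, so average draw is dominated by sleep current plus short transmit bursts.

```python
# Illustrative LoRa node battery budget - numbers are assumptions,
# not measurements from any deployed hardware.
SLEEP_UA = 10            # deep-sleep current, microamps
TX_MA = 120              # transmit current, milliamps
TX_SECONDS = 1.0         # airtime per uplink
INTERVAL_S = 15 * 60     # one reading every 15 minutes
BATTERY_MAH = 2600       # a single 18650-class cell

# Average current = sleep floor + transmit current weighted by duty cycle.
avg_ma = (SLEEP_UA / 1000) + TX_MA * (TX_SECONDS / INTERVAL_S)
months = BATTERY_MAH / avg_ma / 24 / 30

print(f"average draw: {avg_ma:.3f} mA, runtime: {months:.1f} months")
```

With these numbers the node averages well under 0.2 mA, which is why the transmit interval, not the radio's peak current, is the knob that decides battery life.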

Home automation is where all of this converges. My house runs on a mix of custom-built and commercial systems - Zigbee sensors, ESP32 controllers, custom PCBs for specific applications, all feeding into a central automation platform. I've built custom circuits for monitoring washing machine cycles, controlling underfloor heating valves, and automating garden irrigation based on soil moisture sensors and weather forecasts. Some of it is overengineered. All of it works.
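The irrigation logic mentioned above reduces to a small decision rule. This is a hypothetical sketch of that kind of rule (the thresholds and function name are my illustration, not the actual controller code): water only when the soil is dry and no meaningful rain is forecast.

```python
# Hypothetical irrigation rule - thresholds are illustrative defaults.
def should_irrigate(soil_moisture_pct: float,
                    forecast_rain_mm: float,
                    dry_threshold: float = 30.0,
                    rain_skip_mm: float = 5.0) -> bool:
    """Return True when the zone should be watered."""
    if soil_moisture_pct >= dry_threshold:
        return False                      # soil is wet enough already
    if forecast_rain_mm >= rain_skip_mm:
        return False                      # let the weather do the work
    return True

assert should_irrigate(22.0, 0.0) is True
assert should_irrigate(22.0, 8.0) is False   # rain coming, skip the cycle
assert should_irrigate(45.0, 0.0) is False   # soil already moist
```

The forecast check is what separates this from a bare thermostat-style loop: without it, the system waters an hour before a storm and wastes both the water and the battery.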

I'm also interested in open hardware platforms and the right-to-repair movement. I try to build things that can be fixed, modified, and understood. When I design a PCB or write firmware, I document it as if someone else will need to maintain it - because eventually someone will, and that someone is usually future me at 2 AM wondering why the heating controller stopped responding.

3D printing is another tool in the workshop. I run a Bambu Lab P1S which I use for printing custom enclosures for sensor nodes, mounting brackets for cameras and antennas, cable management parts for the server rack, and prototype housings for PCB projects. When you need a specific bracket to mount an SDR antenna at exactly the right angle, or a weatherproof case for an outdoor LoRa node, being able to design it in CAD and have it printed in an hour changes how you approach hardware projects. I've also printed replacement parts for household appliances that would otherwise have ended up in a landfill - which ties back to the right-to-repair philosophy.

This hardware background shapes everything I do in software. When I design a monitoring system, I think about sensor placement and data collection intervals because I've built actual sensors. When I architect a distributed system, I think about network partitions because I've dealt with LoRa packets that don't arrive. When I talk about edge computing, I mean actual edge devices sitting in the field, not just a CDN node in a data center.

Current Work

Outside of my day job, I run OLAB - a personal AI infrastructure lab that serves as the backbone for all my side projects. The cluster consists of a distributed Mac Mini setup for pooled inference and a high-end NVIDIA GPU server for heavy compute, backed by a full security and observability stack built entirely from open-source tools.

All development, testing, and staging for my projects runs on OLAB. FutureGrid - the energy management platform I'm building with a team of energy sector engineers - uses the cluster for CI pipelines, integration testing, and staging before production deployments. WordHunt - an educational word game platform with 22 feature milestones - runs its WebSocket load tests and NLP pipeline training on the same infrastructure.

I treat the lab as a testing ground for every tool I come across. If there's an interesting open-source project, a new monitoring approach, or a framework I want to evaluate for a client recommendation, I deploy it on OLAB first, break it, fix it, and only then suggest it to others. Nothing I recommend to my team or clients is something I haven't run myself.

It's part learning lab, part reference architecture, part proving ground. The write-ups are on my blog.

Contact