Profile
AndreyBoyko
Average Review
0.00
Reputation
22
Finished gigs
0
Finished jobs
0
Website
glivera-team.com
Country
Not specified
Skills
Specialization
AI Automation Engineer | AI Agents, n8n, CRM & Business Automation
Hourly rate
$60/hr
Preferred payment options
USDT
AUDT
USDC
DAI
BUSD
About me
🙋 I am an AI Automation Engineer with 190 delivered projects, $540K+ earned, and a 100% Job Success Score. I build production-grade automation systems that reduce manual work, connect fragmented tools, and improve business operations. I focus on solving key bottlenecks, not just linking apps. I design AI agents, workflow automations, CRM systems, content pipelines, and self-hosted setups using n8n, Make, Zapier, GoHighLevel, HubSpot, ClickUp, Python, PostgreSQL, Docker, the OpenAI API, and the Claude API. Selected results include saving 15+ hours per week for a marketing agency, reducing estimation time from 20 minutes to under 3 for a construction company, and building a GoHighLevel CRM for 1,600+ contacts. I build systems that are practical, scalable, and ready for production from day one. I work best with teams that need reliable AI automation, cleaner operations, and fast proof-of-concept delivery before moving to full deployment. Clear scope, measurable value, and business impact are standard.
Work experience
Glivera-Team
January 2009 – Present
Job title
Founder
Work experience & achievements
I have 15 years of experience in front-end development, which allows me to solve problems of any complexity. My customers get highly efficient websites with good performance, fast loading speeds, and strong SEO optimization.
Education
Kharkiv National University of Radio and Electronics (KHNURE)
Graduation year: 2002
Level of study
Bachelor
Major / Field of study
Computer Programming
No reviews added yet
Persistent Vector Memory for Claude Code: Open-Source MCP Server
AI Automation Engineer with 190 delivered projects, $540K+ earned, and a 100% Job Success Score. I build practical automation systems that reduce manual work, connect fragmented tools, and improve business operations. My focus is on AI agents, API integrations, CRM automation, chatbots, and process automation using n8n, Make, Zapier, OpenAI, Claude, Node.js, and Python. I work with teams that need reliable automations, faster internal workflows, and scalable systems that are ready for production. Selected results include saving 15+ hours per week for a marketing agency, reducing estimation time from 20 minutes to under 3 for a construction company, and building CRM structures for teams managing large contact databases. I build solutions that are clear in scope, fast to deploy, and tied to real business value. I also help businesses replace repetitive tasks with production-ready workflows that support sales, marketing, operations, and client communication across multiple tools and teams.
AI Sales Agent Across Voice, Chat, Telegram, WhatsApp (One Pipeline)
Forward-Deployed AI Engineer specializing in multi-channel AI sales automation. I built a unified AI sales agent that qualifies leads across voice calls, website chat, Telegram, and WhatsApp through one shared pipeline instead of four disconnected tools. The system uses one agent, one knowledge base, one lead record, and one scoring layer across all channels. Operators can launch outbound calling campaigns from a dashboard, while the platform autodials contacts, handles live conversations, processes transcripts, and updates lead scores automatically. Built with Claude, Retell AI, Supabase, pgvector, BullMQ, Next.js, Docker, and nginx. Live in partner testing with 7 Docker services, multi-tenant RBAC, and a 53KB embeddable widget. Designed for B2B sales teams that need scalable qualification, consistent answers, and shared intelligence across every customer touchpoint. It replaces fragmented sales tools with one deployable system for qualification, routing, scoring, and follow-up tasks.
Calendar SOP Enforcement: 3 Calendars, Zero Manual Review
Forward-Deployed AI Engineer specializing in workflow automation, policy enforcement, and operational monitoring. I built an n8n-based system for a US investment fund that audits three Google Calendars against strict meeting SOPs with zero manual review. The workflow checks time-of-day restrictions, required buffers, day-type rules, and daily caps every five minutes, then sends only new violations to Slack through a Supabase-backed dedup layer. A daily 8:30 AM audit adds AI rescheduling suggestions and a Friday preview of the coming week. Built with n8n, Supabase, PostgreSQL, Google Calendar API, Slack API, OpenAI API, and MCP. The system eliminated repeated alerts, removed manual calendar review, and gave the operations team a reliable way to enforce scheduling rules across Business, Fund, and Personal calendars. Designed as a serverless operational tool with safe, controlled updates and production-ready audit logic. It turned a fragile process into a trusted, always-on control layer.
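A minimal sketch of how a dedup layer like the one described above can suppress repeated alerts. This is an illustration, not the actual implementation: the fingerprint fields and function names are assumptions, and an in-memory set stands in for the Supabase-backed store.

```python
import hashlib

def violation_key(calendar: str, event_id: str, rule: str) -> str:
    # Stable fingerprint for one violation, so the same issue is never re-alerted.
    return hashlib.sha256(f"{calendar}|{event_id}|{rule}".encode()).hexdigest()

def filter_new_violations(violations: list, seen: set) -> list:
    # `seen` stands in for a database table keyed by fingerprint (an assumption).
    fresh = []
    for v in violations:
        key = violation_key(v["calendar"], v["event_id"], v["rule"])
        if key not in seen:
            seen.add(key)
            fresh.append(v)
    return fresh
```

On each five-minute run, only violations whose fingerprint is absent from the store would be forwarded to Slack; everything already seen is silently dropped.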
AI-Native Control for Worksection: Claude, Cursor, ChatGPT via MCP
Forward-Deployed AI Engineer building AI-native operational systems. I created a production MCP server that gives Claude, Cursor, and ChatGPT direct, authenticated access to Worksection, so teams can create tasks, post comments, log costs, manage projects, and update work without copy-paste. The system runs as a multi-tenant SaaS with OAuth2, encrypted credentials, per-tenant rate limits, and an admin dashboard for tenant management, secret rotation, and usage analytics. Built with Node.js, TypeScript, Express, Supabase, React, Docker, and nginx. Twenty-six MCP tools are live in production across tasks, projects, comments, members, costs, tags, and files. Designed for teams that want AI assistants to work inside their PM stack securely and reliably. It turned Worksection from a manual destination into an AI-controlled execution layer for daily operations across multiple tenants. This removed repetitive admin work and enabled direct AI-driven project actions with secure tenant isolation.
AI Assistant for ServiceFusion: Query Jobs by Chat, Not Clicks
Forward-Deployed AI Engineer building AI assistants for field service operations. I created a multi-tenant MCP server that lets Claude and other AI clients query ServiceFusion in plain English instead of forcing dispatchers to click through menus. The platform exposes 13 structured tools for customers, jobs, estimates, technicians, and equipment, with encrypted credential storage, automatic OAuth2 token refresh, per-tenant rate limits, and self-service onboarding. Built with Node.js, TypeScript, Express, Supabase, Docker, and nginx. Designed for operators rolling out AI-assisted workflows across multiple companies without manual tenant setup. Dispatchers can now retrieve customer history, open jobs, estimates, and technician assignments conversationally through one secure AI layer. The result is faster access to operational data, less repetitive admin work, and a production-ready integration that scales cleanly across tenants. Built for secure, reliable daily use by real dispatch teams.
Grocery Price Intelligence: 60%+ Basket Savings Across 4 Stores
Forward-Deployed AI Automation Engineer building grocery price intelligence and cart automation systems. I created a full-stack platform that compares one shopping basket across four major Ukrainian grocery chains in real time, then helps assemble the winning cart inside the user’s own Chrome session. The system includes a NestJS API, React dashboard, multi-store price engine, receipt parser, and browser-extension bridge for reliable cart building under real-world site restrictions. Live testing showed a five-item basket at 301 UAH in one store versus 787 UAH in another, a 62% savings on the same basket. Built with NestJS, TypeScript, React, Supabase, Redis, BullMQ, Playwright, Docker, and Chrome extension tooling. Designed for consumer platforms that need fast basket comparison, real cart automation, and repeat-order flows instead of fragile demo scraping. Production-oriented, scalable, and focused on measurable shopper savings. Built to turn price checks into reliable basket savings.
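The 62% figure quoted above follows from simple arithmetic; this small helper just formalizes the calculation used on the live-test numbers.

```python
def basket_savings_pct(cheapest: float, most_expensive: float) -> int:
    # Percent saved by buying the same basket at the cheapest store.
    return round((most_expensive - cheapest) / most_expensive * 100)
```

With the live-test figures, `basket_savings_pct(301, 787)` gives 62, matching the quoted savings.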
Automated Trading Platform: Years of Strategy Validation in Hours
Forward-Deployed AI Engineer building quant infrastructure and trading automation systems. I created a platform that validates 150+ trading strategies in parallel against years of market data, catches look-ahead bias before deployment, and promotes only statistically sound strategies to live execution. What previously took months of paper trading now takes hours. The system scores setups, runs walk-forward validation, executes live trades on approved signals, and monitors exits continuously. Built with Node.js, Supabase, PostgreSQL, Docker, and exchange APIs. Designed for traders who need evidence-based automation instead of gut-feel decisions and misleading backtests. It replaced manual validation with a production-grade workflow for testing, filtering, and executing strategies with real statistical thresholds, safer live deployment, and continuous monitoring across crypto and equity markets. Built to reduce false confidence, eliminate weak ideas early, and protect capital in trading.
Automated FL Tax Deed Auction Research: Hours of Work, Zero Clicks
Forward-Deployed AI Engineer building self-hosted real estate research automation for Florida tax deed investors. I created an overnight pipeline that replaces hours of manual parcel vetting across county appraiser portals, FEMA flood maps, parcel-boundary tools, and satellite views. The system scrapes auction listings, pulls county property data, resolves GPS coordinates, captures four screenshot types per parcel, and runs Claude Vision to score each property from 0 to 100 with a buy, review, or skip verdict. Built with NestJS, Playwright, BullMQ, Redis, Supabase, React, Docker, and VPN-routed scraping. Designed for investors who need ranked parcel shortlists instead of spending auction days clicking through dozens of tabs. The platform is self-hosted, keeps data and scoring logic under operator control, and delivers a live dashboard with parcel details, screenshots, queue status, exports, and AI-driven recommendations for faster, more consistent auction research, ready for daily use today.
3M+ Deed Records Processed: Property Research from Days to Minutes
Forward-Deployed AI Engineer building property intelligence systems for legal-tech and real estate research. I created an end-to-end platform that helps attorneys find undervalued land parcels by processing deed records, filtering them by legal language, enriching matches with geo and infrastructure data, and scoring results with AI. The system combines a React SPA frontend with an n8n-as-API backend, Supabase, and Claude-based analysis. It has processed 3M+ deed records and reduced qualified property research from days to minutes. Attorneys can search, review, and download deed matches through one interface instead of manually cross-checking county records, tax maps, demographics, and PDFs across multiple tools. Built for a US legal-tech founder and now being prepared for microSaaS launch. Designed for fast iteration, controlled data workflows, and production use by teams handling complex property research at scale. Built to make hard multi-source land queries practical and repeatable.
30 AI Agents, 5 Platforms, One Dashboard: Autonomous Content System
AI Agent Architect and Full-Stack Developer building autonomous content systems for multi-platform publishing. I created a dashboard-driven workflow with 30 AI agents orchestrated through BullMQ to handle research, writing, fact-checking, AI detection, neuromarketing optimization, adaptation, approval, and publishing across WordPress, LinkedIn, Dev.to, Hashnode, and Telegram. The system reduced content production time by about 85%, turning a 4-6 hour manual process into roughly 30 minutes of oversight. Approval runs through a React dashboard or Telegram bot, while analytics feed back into the strategy layer to improve future content automatically. Built with Node.js, TypeScript, BullMQ, Redis, PostgreSQL, React, Telegram bot tooling, and multiple AI APIs. Designed for creators and teams that need scalable, production-ready content operations with one-click multi-platform publishing, consistent quality control, and autonomous execution. Built for reliable scaling, retries, and approval.
37,000 Records Migrated Across 3 Platforms in 2 Weeks, Zero Data Loss
AI Automation Engineer and Data Migration Specialist specializing in high-stakes CRM and API migrations. I built a resumable Python pipeline that migrated about 37,000 records across RadiusBob, GoHighLevel, and HealthSherpa for a Medicare insurance brokerage in two weeks with zero data loss. The system handled 1,593 contacts, 1,222 policies, 32,507 notes, and 793 relationships, with contact deduplication, three-pass HealthSherpa matching, and parallel note processing for an 8x speed increase. Built with Python, Docker, Supabase, asyncio, and multiple APIs, the pipeline tracked every record individually, resumed safely after interruptions, and preserved full client history. Results included 100% contacts migrated, 100% notes preserved, 99.3% policies imported, and one-click Medicare enrollment links added to client records. Designed for compliance-sensitive teams that need reliable, auditable migrations across complex systems without manual rework or downtime. Production-ready at scale.
CRM automation - Invoice Collection Automation - ClickUp
Automation Architect specializing in invoice collection and CRM workflow systems for service businesses. I built a three-part automation for a US construction company handling 5,000+ invoices in Service Fusion with no reliable way to track overdue payments. The system synced invoice and estimate data into Smartsheet, created a ClickUp-based collection workflow with seven payment-stage email templates, and ran a 30-minute auto-sync that closed tasks automatically when invoices were paid. A key challenge was that Service Fusion did not directly link invoices to clients, so I designed a programmatic connection through the Jobs table to make tracking possible. The result was the company’s first real collection process, 60 hours per month saved for the accountant, and follow-up time reduced from 15 minutes per invoice to one click. Built for teams that need scalable collections, cleaner data flow, and reliable follow-up without extra headcount. Delivered in 2.5 weeks, fully production-ready.
CRM automation - SEO Analytics Data Scraping from SimilarWeb, Ahrefs
n8n Automation Developer and TypeScript engineer specializing in data scraping and reporting pipelines. I built an automation system that collects SEO and web analytics data from SimilarWeb, Ahrefs, and SEMrush without using their paid APIs. A browser-based script extracts the required metrics, converts them into structured JSON, and sends them through a webhook to n8n, where the data is processed and stored in a database for continuous reporting. The system can run on demand or on a schedule, giving the client reliable access to fresh analytics data for recurring reports without expensive API subscriptions. Designed for teams that need scalable data collection, lower tooling costs, and dependable reporting workflows. Built with browser automation, n8n, JavaScript, webhooks, scheduled jobs, and database storage. Production-ready for ongoing analytics collection and reporting across multiple SEO data sources. It replaced manual exports with a repeatable pipeline for ready-to-use SEO data.
CRM automation / n8n Automation - API Data Loader
n8n Automation Developer and TypeScript engineer specializing in API integrations, workflow orchestration, and data processing pipelines. I built an automated API data loader that fetches paginated data from external services, manages expiring tokens, enriches records with keywords, and inserts structured results into PostgreSQL without manual effort. The workflow handles scheduled execution, OAuth/token refresh logic, JSON mapping, filtering, item splitting, database insertion, page tracking, and logging for monitoring and debugging. Built in n8n with logic for retries, token updates, and global variable control, the system can process API feeds reliably over time and recover automatically from common failures. Designed for teams that need dependable backend automation for syncing external data into internal systems. It replaced repetitive manual imports with a production-ready workflow for scheduled loading, enrichment, storage, and operational visibility across evolving API sources.
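The loader's core loop can be sketched in Python under assumed interfaces. The `fetch_page` and `refresh_token` callables and the `TokenExpired` signal are illustrative stand-ins for the actual n8n nodes, not the real workflow.

```python
class TokenExpired(Exception):
    """Raised by fetch_page when the API rejects an expired token."""

def load_all_pages(fetch_page, refresh_token, token: str, max_pages: int = 1000) -> list:
    # fetch_page(token, page) -> (items, has_more); may raise TokenExpired.
    items, page = [], 1
    while page <= max_pages:
        try:
            batch, has_more = fetch_page(token, page)
        except TokenExpired:
            token = refresh_token()           # refresh, then retry the same page
            continue
        items.extend(batch)
        if not has_more:
            break
        page += 1
    return items
```

The key detail is that a token failure retries the same page with a fresh token instead of skipping it, so no page of the feed is lost across a refresh.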
n8n Automation - Voice Files → Text Doc with OpenAI + Notion + Dropbox
n8n Automation Developer specializing in AI-powered document workflows and file processing systems. I built an automation that turns voice files into structured text documents using OpenAI, Notion, and Dropbox. The workflow monitors incoming audio files, processes them automatically, converts speech to text, organizes the output, and creates clean documentation in Notion without manual copy-paste. Built for teams that need faster note handling, searchable records, and reliable document creation from voice input. Designed with n8n, JavaScript, API integrations, and AI transcription workflows, the system reduces repetitive admin work and creates a consistent pipeline from raw audio to usable text. It is suitable for internal knowledge capture, meeting notes, voice memos, and operational documentation. The result is a practical automation that saves time, reduces manual handling, and turns scattered voice recordings into structured, accessible text records ready for daily use across ops.
n8n Automation MCP server calculator for Concrete Cutting from USA
n8n Automation Developer and TypeScript engineer specializing in construction quoting systems and workflow automation. I rebuilt a 300-row concrete cutting pricing spreadsheet into a fast quote engine for a US construction company. Every hidden formula was migrated into a clean API, connected to n8n, and exposed through two simple interfaces: a one-page web calculator and a chat-based assistant. Staff can now enter requests like drilling 5 holes at a given diameter in plain language and receive a priced PDF quote in about 40 seconds instead of spending 15 minutes working through spreadsheet logic manually. The system removed spreadsheet breakpoints, made pricing updates centralized and fast, and helped owners send instant, error-free bids to customers. Built for teams that need faster estimating, cleaner pricing logic, and a scalable workflow that turns field inputs into accurate quotes for daily use. It cut manual effort, improved consistency, and made quote delivery production-ready.
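As an illustration of what migrating spreadsheet pricing into code looks like, here is a hypothetical quoting function. The rates, minimum charge, and formula below are invented for the sketch; the real logic came from the 300-row sheet and is not reproduced here.

```python
# Hypothetical per-inch drilling rates by hole diameter (inches) -- not real pricing.
CORE_DRILL_RATE_PER_INCH = {2: 18.0, 4: 30.0, 6: 45.0}
MINIMUM_CHARGE = 350.0  # assumed job minimum

def quote_core_drilling(holes: int, diameter_in: int, depth_in: float) -> float:
    # Price = holes x depth x per-inch rate for the diameter, floored at the minimum.
    if diameter_in not in CORE_DRILL_RATE_PER_INCH:
        raise ValueError(f"no rate configured for {diameter_in}-inch diameter")
    price = holes * depth_in * CORE_DRILL_RATE_PER_INCH[diameter_in]
    return round(max(price, MINIMUM_CHARGE), 2)
```

A request like "drill 5 holes, 4-inch diameter, 8 inches deep" becomes a single function call, which is what lets the web calculator and chat assistant share one pricing source instead of duplicating spreadsheet logic.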
n8n Automation Deed scraping, Browser Automation + n8n, Supabase
n8n Automation Developer specializing in deed scraping, browser automation, and property data pipelines. I built a workflow that captures deed data from the browser through HTTP requests, enriches each record with ZIP-derived location details, county codes, and GPS coordinates, and syncs the final structured data into Notion and Supabase. The system was designed to eliminate manual copy-paste work and turn raw deed information into clean, searchable records for research and operations. Built with n8n, browser automation, HTTP integrations, and database workflows, it supports reliable intake, enrichment, and storage of property data at scale. Designed for teams that need faster deed processing, cleaner records, and a repeatable pipeline from browser extraction to organized databases. It replaces fragmented manual steps with one production-ready automation flow for daily deed data capture, enrichment, sync, and review tasks. Built for repeatable property record handling across all teams.
n8n Automation: Sync Worksection Tasks with HubSpot Deals
n8n Automation Developer specializing in CRM integrations and sales workflow automation. I built a real-time integration between Worksection and HubSpot that creates HubSpot deals automatically from new Worksection tasks, syncs contact data, updates task records, and keeps both systems aligned without manual entry. The workflow processes deal-related information, maps fields between platforms, and ensures sales and project data stay consistent as work moves forward. Built with n8n and API-based automation, the solution reduced repetitive admin work, lowered the risk of human error, and gave the sales team a faster, cleaner handoff between task management and CRM operations. Designed for teams that need reliable cross-platform syncing, accurate deal creation, and scalable automation between internal workflows and customer-facing systems. Production-ready for daily use and built to support cleaner operations as volume grows. It replaced manual copying with a dependable workflow layer.
n8n Automation for Smart Email Sorting and Business Workflow
n8n Automation Developer specializing in AI-powered email processing and business workflow automation. I built a smart Gmail automation that classifies incoming emails into categories such as commercial, retail, social, important, and other, then applies labels, removes low-priority messages from the inbox, and logs actions to Google Sheets for reporting and tracking. The workflow was designed to help businesses reduce inbox noise, improve response visibility, and save time on repetitive email handling. Built with n8n, Gmail integrations, AI-based classification, and reporting logic, the system creates a cleaner communication flow and supports more efficient daily operations. Designed for teams that need scalable email triage, better inbox control, and reliable automation around communication management. It replaced manual sorting with a production-ready workflow that keeps email organized, visible, and easier to manage every day. It adds audit visibility to every processed email flow.
n8n Automation for RSS work evaluations
n8n Automation Developer specializing in job feed evaluation and opportunity scoring workflows. I built an automated system that monitors RSS job listings every five minutes, scores each new opportunity through a 10-point keyword and skill-matching model, and routes qualified jobs automatically to Discord and Notion. The workflow also manages and updates tags through HTTP requests to keep tracking accurate and organized over time. Built with n8n, JavaScript, API integrations, Discord, and Notion, the system replaces manual job scanning with a repeatable pipeline for filtering, ranking, and storing relevant opportunities. Designed for teams or solo operators who need faster lead discovery, cleaner opportunity tracking, and consistent evaluation logic. Production-ready for continuous use, with scheduled execution, structured scoring, automated notifications, and database logging for ongoing review and workflow visibility. It saves time, cuts noise, and surfaces the best-fit roles faster.
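A 10-point keyword and skill-matching model of the kind described above could look roughly like this; the keywords and weights are invented for the sketch, and the real workflow runs this logic inside n8n rather than standalone Python.

```python
def score_job(title: str, description: str, keyword_weights: dict) -> int:
    # Sum the weights of every matched keyword, capped at the 10-point scale.
    text = f"{title} {description}".lower()
    score = sum(w for kw, w in keyword_weights.items() if kw.lower() in text)
    return min(score, 10)
```

Listings scoring above a chosen threshold would be the ones routed to Discord and Notion, while low scorers are dropped without notification.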