
Proactive Transaction Monitoring and Anomaly Detection: A Guide for Finance and Operations Leaders

In this blog, we walk through a practical way to keep an eye on payments, catch problems early, and send clear alerts so your team can reduce revenue loss.


What This Solves 

Fraud and glitches drain money before anyone notices. This monitoring system for payments and orders watches every transaction in real time, flags problems instantly, and explains them in plain language your team can act on.




What It Does

We monitor four high-impact signals that directly affect revenue, security, and customer experience:

  1. Price Anomalies
    Catches items sold below or above their catalog price—often a sign of coupon abuse, mis-tags, or data entry mistakes.
  2. Sales Volume Drops
    Spots unusual dips in orders or revenue that often mean an outage, API failure, or a broken checkout flow.
  3. Spam / Fraud Bursts
    Detects bot-like bursts (e.g., card-testing: many small, rapid transactions from the same user) before they become a mess.
  4. Failed Payments
    Surfaces spikes in declines so you can quickly isolate gateway issues vs customer errors and keep conversion healthy.

Alongside the raw detections, the platform uses an LLM (Large Language Model) to turn data into plain-English summaries—“what happened, who was impacted, and why it’s suspicious”—so operations can respond immediately without wading through logs.




How It Works (no jargon)

  • FastAPI backend (Python): A lightweight web service that exposes endpoints to run each check on demand.
  • MySQL database: Your single source of truth for customers, products, and transactions.
  • WebSockets dashboard: A live view—alerts stream as they happen.
  • LLM summaries: Converts anomaly data into short, readable lines your analysts can copy straight into tickets or incident channels.



Quickstart: Notes for the Tech Team

You don’t need heavy infrastructure. With Python and MySQL installed, setup is straightforward:

  1. Clone the repo into your working folder and create a Python virtual environment.
  2. Install dependencies:
pip install -r requirements.txt

  3. Add a .env file in the project root (keeps credentials out of code):
MYSQL_HOST=localhost
MYSQL_USER=your_mysql_username
MYSQL_PASSWORD=your_mysql_password
MYSQL_DB=anology_superteams
AI_API_KEY="your_llm_api_key"

  4. Set up the database (tables + sample data):
mysql -u your_mysql_username -p < setup.sql

  5. Run the FastAPI app and hit endpoints for each anomaly type. You’ll get JSON and/or plain-English outputs in real time.
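To launch the app locally, a typical command (assuming you serve it with uvicorn, the usual ASGI server for FastAPI apps) is:

uvicorn main:app --reload

For reference, here are the table definitions from setup.sql: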
-- Create the database if it doesn't already exist
CREATE DATABASE IF NOT EXISTS anology_superteams;

-- Switch to the new database
USE anology_superteams;

-- =============================================
-- STEP 1: CREATE THE TABLES
-- =============================================

-- Table: customers
CREATE TABLE IF NOT EXISTS `customers` (
  `customer_id` bigint NOT NULL AUTO_INCREMENT,
  `customer_name` varchar(255) NOT NULL,
  `mobile_number` varchar(20) DEFAULT NULL,
  `email` varchar(255) DEFAULT NULL,
  `city` varchar(100) DEFAULT NULL,
  `state` varchar(100) DEFAULT NULL,
  `pincode` varchar(20) DEFAULT NULL,
  `country` varchar(100) DEFAULT NULL,
  PRIMARY KEY (`customer_id`),
  UNIQUE KEY `mobile_number` (`mobile_number`)
);

-- Table: products
CREATE TABLE IF NOT EXISTS `products` (
  `product_id` varchar(20) NOT NULL,
  `product_name` varchar(255) NOT NULL,
  `product_price` decimal(10,2) NOT NULL,
  PRIMARY KEY (`product_id`)
);

-- Table: transactions
CREATE TABLE IF NOT EXISTS `transactions` (
  `transaction_id` bigint NOT NULL AUTO_INCREMENT,
  `customer_id` bigint NOT NULL,
  `transaction_date` datetime DEFAULT CURRENT_TIMESTAMP,
  `status` enum('SUCCESS','FAILED') DEFAULT 'SUCCESS',
  `total_amount` decimal(10,2) DEFAULT '0.00',
  PRIMARY KEY (`transaction_id`),
  KEY `customer_id` (`customer_id`),
  CONSTRAINT `transactions_ibfk_1` FOREIGN KEY (`customer_id`) REFERENCES `customers` (`customer_id`)
);

-- Table: transaction_items
CREATE TABLE IF NOT EXISTS `transaction_items` (
  `item_id` bigint NOT NULL AUTO_INCREMENT,
  `transaction_id` bigint NOT NULL,
  `product_id` varchar(20) NOT NULL,
  `product_name` varchar(255) DEFAULT NULL,
  `product_price` decimal(10,2) DEFAULT NULL,
  `quantity` int DEFAULT '1',
  PRIMARY KEY (`item_id`),
  KEY `transaction_id` (`transaction_id`),
  KEY `product_id` (`product_id`),
  CONSTRAINT `transaction_items_ibfk_1` FOREIGN KEY (`transaction_id`) REFERENCES `transactions` (`transaction_id`),
  CONSTRAINT `transaction_items_ibfk_2` FOREIGN KEY (`product_id`) REFERENCES `products` (`product_id`)
);




main.py – Application Entry Point

Exposes endpoints for the four anomaly checks plus a simple report.

from fastapi import FastAPI
from fastapi.responses import PlainTextResponse
from db import get_connection
from report import get_transaction_report
from case1_price_anomalies import check_price_anomalies
from case2_less_sale import less_sale_days
from case3_spam_transaction import detect_spam_transactions
from services.services1 import price_anomaly_service
from services.services2 import less_sale_days_service
from services.services3 import spam_transaction_service
from services.services4 import failed_transaction_service


app = FastAPI()
@app.get("/")
def home():
   return "Hello, World!"


@app.get("/report")
def transaction_report():
   return get_transaction_report()


@app.get("/price_anomalies", response_class=PlainTextResponse)
def price_anomalies_nlq():
   """Get price anomalies in natural language format"""
   return price_anomaly_service.get_price_anomalies_nlq()


@app.get("/less-sale-nlq", response_class=PlainTextResponse)
def get_less_sale_nlq():
   """Get less sale days in natural language format"""
   return less_sale_days_service.get_less_sale_days_nlq()


@app.get("/spam-transactions-nlq", response_class=PlainTextResponse)
def spam_transactions_nlq():
   """Get spam transactions in natural language format"""
   return spam_transaction_service.get_spam_transactions_nlq()


@app.get("/failed-transactions-nlq", response_class=PlainTextResponse)
def failed_transactions_nlq():
   """Get failed transactions in natural language format"""
   return failed_transaction_service.get_failed_transactions_nlq()
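

Once the server is up, each check is a single HTTP call. For example, assuming uvicorn's default port 8000:

curl http://localhost:8000/price_anomalies
curl http://localhost:8000/less-sale-nlq
curl http://localhost:8000/spam-transactions-nlq
curl http://localhost:8000/failed-transactions-nlq

Each endpoint returns the plain-text summary produced by its service, ready to paste into a ticket or incident channel.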




db.py — Database Connection Setup

Pulls credentials from .env and opens a MySQL connection.

from dotenv import load_dotenv
import os
import mysql.connector


# Load environment variables from .env
load_dotenv()


def get_connection():
   return mysql.connector.connect(
       host=os.getenv("MYSQL_HOST"),
       user=os.getenv("MYSQL_USER"),
       password=os.getenv("MYSQL_PASSWORD"),
       database=os.getenv("MYSQL_DB")
   )
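

A quick way to verify that the .env credentials are being picked up (a throwaway sanity check, not a file from the repo):

# run from a Python shell; uses mysql-connector's is_connected()
from db import get_connection

conn = get_connection()
print("Connected:", conn.is_connected())
conn.close()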




Handling Anomaly Detection Cases (Detection Logic)

Case 1: Price Anomalies (case1_price_anomalies.py)

Finds transactions where the sold price is far below or above the catalog price (below 75% or above 105% of it).

from db import get_connection

def check_price_anomalies():
   conn = get_connection()
   cursor = conn.cursor(dictionary=True)

   query = """
   SELECT
       t.transaction_id,
       t.customer_id,
       c.customer_name,
       t.transaction_date,
       t.status,
       t.total_amount,
       ti.product_id,
       ti.product_name,
       ti.product_price AS transaction_price,
       p.product_price AS actual_price,
       ti.quantity
   FROM transactions t
   JOIN customers c ON t.customer_id = c.customer_id
   JOIN transaction_items ti ON t.transaction_id = ti.transaction_id
   JOIN products p ON ti.product_id = p.product_id
   WHERE ti.product_price < (p.product_price * 0.75)
   OR ti.product_price > (p.product_price * 1.05)
   ORDER BY t.transaction_id;
   """

   cursor.execute(query)
   rows = cursor.fetchall()
   cursor.close()
   conn.close()

   # Group results by transaction
   result = {}
   for row in rows:
       tid = row["transaction_id"]
       if tid not in result:
           result[tid] = {
               "transaction_id": row["transaction_id"],
               "customer_id": row["customer_id"],
               "customer_name": row["customer_name"],
               "transaction_date": row["transaction_date"].isoformat() if row["transaction_date"] else None,
               "status": row["status"],
               "total_amount": float(row["total_amount"]),
               "products": [],
               "product_details": []
           }

       result[tid]["products"].append(row["product_name"])
       # include actual_price for clarity
       result[tid]["product_details"].append(
           f"{row['product_name']} ({row['quantity']} x {row['transaction_price']:.2f}, actual {row['actual_price']:.2f})"
       )

   # Format final output
   output = []
   for val in result.values():
       val["products"] = ", ".join(val["products"])
       val["product_details"] = ", ".join(val["product_details"])
       output.append(val)

   return output

Case 2: Sales Volume Monitoring (case2_less_sale.py)

Flags days where the daily transaction count falls below the average across all days (computed with a window function over the daily aggregates), often an early outage signal.

from db import get_connection

def less_sale_days():
   conn = get_connection()
   cursor = conn.cursor(dictionary=True)


   query = """
   SELECT
       transaction_day,
       total_transactions,
       total_amount_per_day
   FROM (
       SELECT
           DATE(transaction_date) AS transaction_day,
           COUNT(*) AS total_transactions,
           SUM(total_amount) AS total_amount_per_day,
           AVG(COUNT(*)) OVER () AS avg_transactions,
           AVG(SUM(total_amount)) OVER () AS avg_amount
       FROM transactions
       GROUP BY DATE(transaction_date)
   ) t
   WHERE total_transactions < avg_transactions
   ORDER BY transaction_day;
   """


   cursor.execute(query)
   rows = cursor.fetchall()
   conn.close()
   return rows

Case 3: Spam and Fraud Transactions (case3_spam_transaction.py)

Detects bot-like bursts: >5 transactions by the same customer in the same minute.

from db import get_connection

def detect_spam_transactions():
   """
   Detects spam transactions where a customer makes more than 5
   transactions within the same minute.
   """
   conn = get_connection()
   cursor = conn.cursor(dictionary=True)


   query = """
   SELECT
       t.customer_id,
       c.customer_name,
       DATE(t.transaction_date) AS txn_day,
       HOUR(t.transaction_date) AS txn_hour,
       MINUTE(t.transaction_date) AS txn_minute,
       COUNT(*) AS txn_count
   FROM transactions t
   JOIN customers c ON t.customer_id = c.customer_id
   GROUP BY t.customer_id, txn_day, txn_hour, txn_minute
   HAVING COUNT(*) > 5
   ORDER BY t.customer_id, txn_day, txn_hour, txn_minute;
   """


   cursor.execute(query)
   rows = cursor.fetchall()


   conn.close()
   return rows

Case 4: Failed Transactions (case4_payment_fail.py)

Lists failed payments with who/when/what so you can triage quickly.

from db import get_connection

def get_failed_transactions():
   conn = get_connection()
   cursor = conn.cursor(dictionary=True)


   query = """
   SELECT
       t.transaction_id,
       t.transaction_date,
       c.customer_id,
       c.customer_name,
       ti.product_name,
       t.total_amount
   FROM transactions t
   JOIN customers c ON t.customer_id = c.customer_id
   JOIN transaction_items ti ON t.transaction_id = ti.transaction_id
   WHERE t.status = 'FAILED'
   ORDER BY t.transaction_date DESC;
   """


   cursor.execute(query)
   rows = cursor.fetchall()
   cursor.close()
   conn.close()
   return rows



Services Layer 

Case 1: Price Anomaly Service

Calls the detection logic, then returns either a natural-language (NLQ) summary or raw JSON. The same pattern applies to the other cases.

from typing import Dict, Any
from case1_price_anomalies import check_price_anomalies
from models.nlq_translator import translate_price_anomalies_to_nlq


class PriceAnomalyService:
   """Case 1: turns price-anomaly detections into NLQ or JSON output."""
  
   def get_price_anomalies_nlq(self) -> str:
       try:
           anomalies_data = check_price_anomalies()
           if not anomalies_data:
               return "No price anomalies detected in the system."
           nlq_result = translate_price_anomalies_to_nlq(anomalies_data)
           return nlq_result
       except Exception as e:
           return f"Error occurred while processing price anomalies: {str(e)}"
  
   def get_price_anomalies_json(self) -> list:
       """Get price anomalies in JSON format"""
       try:
           return check_price_anomalies()
       except Exception as e:
           return [{"error": f"Error occurred while fetching price anomalies: {str(e)}"}]


# Create a singleton instance for easy import
price_anomaly_service = PriceAnomalyService()
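
Because the module exposes a ready-made singleton, the service can be exercised directly from a script or REPL, mirroring the import used in main.py:

from services.services1 import price_anomaly_service

print(price_anomaly_service.get_price_anomalies_nlq())   # plain-English lines
print(price_anomaly_service.get_price_anomalies_json())  # raw detection data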

Other Cases

Create the remaining three services in the same way as above, each pointing to its own anomaly detection logic and translator:

  • Case 2: Sales Volume Service
    Calls less_sale_days()
    Uses translate_less_sale_days_to_nlq()
  • Case 3: Spam Transaction Service
    Calls detect_spam_transactions()
    Uses translate_spam_transactions_to_nlq()
  • Case 4: Failed Transaction Service
    Calls get_failed_transactions()
    Uses translate_failed_transactions_to_nlq()

Each of these classes should follow the same pattern:

  • A method to return natural language output
  • A method to return raw JSON output

  • A singleton instance at the bottom for easy import
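
As an illustration, here is a sketch of the Case 4 service following the same pattern (the repo's actual services/services4.py may differ in detail):

# services/services4.py (sketch, mirroring the Case 1 service)
from case4_payment_fail import get_failed_transactions
from models.nlq_translator import translate_failed_transactions_to_nlq


class FailedTransactionService:
   """Case 4: failed transaction service."""

   def get_failed_transactions_nlq(self) -> str:
       try:
           failed_data = get_failed_transactions()
           if not failed_data:
               return "No failed transactions detected in the system."
           return translate_failed_transactions_to_nlq(failed_data)
       except Exception as e:
           return f"Error occurred while processing failed transactions: {str(e)}"

   def get_failed_transactions_json(self) -> list:
       """Get failed transactions in JSON format"""
       try:
           return get_failed_transactions()
       except Exception as e:
           return [{"error": f"Error occurred while fetching failed transactions: {str(e)}"}]


# Singleton instance, imported by main.py
failed_transaction_service = FailedTransactionService()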




Natural Language Translation with Prompts - prompt.py

Fixed sentence templates keep alerts consistent across teams/tools.

# app/models/prompt.py


CASE_1_PROMPT = """
You are analyzing price anomaly data. You must process EVERY SINGLE TRANSACTION in the data.


For each price anomaly transaction, write one line in this exact format:
"Transaction ID [ID]: Customer [NAME] (ID: [CUSTOMER_ID]) purchased [PRODUCT] on [DATE]. The product has an actual catalog price of ₹[ACTUAL_PRICE] but was sold for ₹[TRANSACTION_PRICE], resulting in a [PERCENTAGE]% [DISCOUNT/MARKUP]. This represents a [SEVERITY] price anomaly - [IMPACT]."


WHERE:
- [ID] = transaction_id
- [NAME] = customer_name 
- [CUSTOMER_ID] = customer_id
- [PRODUCT] = product_name
- [DATE] = format transaction_date as "September 12, 2025"
- [ACTUAL_PRICE] = actual_price with 2 decimals
- [TRANSACTION_PRICE] = transaction_price with 2 decimals
- [PERCENTAGE] = calculate percentage difference (round to 1 decimal)
- [DISCOUNT/MARKUP] = "discount" if transaction_price < actual_price, "markup" if higher
- [SEVERITY] = "CRITICAL" if >50% difference, "HIGH" if 25-50%, "MODERATE" if 10-25%
- [IMPACT] = "significant revenue loss" for underpricing, "customer overcharged" for overpricing


CRITICAL: You must output ALL transactions from the data. Do not skip any. Do not summarize.
If the data contains 9 transactions, you MUST write exactly 9 lines.


Example:
Transaction ID 904: Customer Eva Green (ID: 5) purchased Mouse on September 12, 2025. The product has an actual catalog price of ₹500.00 but was sold for ₹68.00, resulting in a 86.4% discount. This represents a CRITICAL price anomaly - significant revenue loss.
"""


CASE_2_PROMPT = """
You are analyzing daily sales data. You must process EVERY SINGLE DAY in the data.


For each low-performing day, write one line in this exact format:
"[DATE]: [TRANSACTIONS] transactions, ₹[AMOUNT] sales - low performance"


WHERE:
- [DATE] = convert transaction_day (2025-09-15) to "September 15, 2025" format
- [TRANSACTIONS] = total_transactions number
- [AMOUNT] = total_amount_per_day with commas


CRITICAL: You must output ALL days from the data. Do not skip any. Do not summarize.
If the data contains 5 days, you MUST write exactly 5 lines.


Example for multiple days:
September 8, 2025: 77 transactions, ₹3,544,100 sales - low performance
September 13, 2025: 80 transactions, ₹5,333,700 sales - low performance
September 14, 2025: 60 transactions, ₹2,976,800 sales - low performance
September 15, 2025: 17 transactions, ₹333,200 sales - low performance
September 16, 2025: 4 transactions, ₹3,570 sales - low performance
"""


CASE_3_PROMPT = """
You are analyzing spam transaction data. You must process EVERY SINGLE SPAM INCIDENT in the data.


For each spam incident, write one line in this exact format:
"Customer [NAME] (ID: [ID]) made [COUNT] transactions on [DATE] at [TIME] - suspicious activity detected"


WHERE:
- [NAME] = customer_name from the data
- [ID] = customer_id from the data 
- [COUNT] = txn_count from the data
- [DATE] = format txn_day as "September 11, 2025"
- [TIME] = format txn_hour:txn_minute as "17:02"


CRITICAL: You must output ALL spam incidents from the data. Do not group by customer. Do not summarize.
If the data contains 57 spam incidents, you MUST write exactly 57 lines.
Each incident (different day/time) gets its own line, even for the same customer.


Example:
Customer Alice Johnson (ID: 1) made 8 transactions on September 8, 2025 at 17:02 - suspicious activity detected
Customer Alice Johnson (ID: 1) made 11 transactions on September 11, 2025 at 17:02 - suspicious activity detected
Customer Alice Johnson (ID: 1) made 7 transactions on September 12, 2025 at 17:02 - suspicious activity detected
"""


CASE_4_PROMPT = """
You are a transaction reporting bot. Your only job is to list failed transactions.
For EACH transaction in the provided JSON data, create a single-line summary.
Do not add any extra analysis, introductory sentences, or conclusions.


Follow this template exactly for each line:
Transaction Failed: [Date and Time] - Customer: [Customer Name] (ID: [Customer ID]) - Product: [Product Name]


Example:
Transaction Failed: 2025-09-16 17:30:15 - Customer: Eva Green (ID: 5) - Product: Printer
"""



The Core Translator: From Data to Natural Language (nlq_translator.py)

Thin wrapper that applies the correct prompt, calls the LLM, and returns readable lines. Falls back gracefully with helpful messages.

import json
import requests
import os
from dotenv import load_dotenv
from .prompt import CASE_1_PROMPT, CASE_2_PROMPT, CASE_3_PROMPT, CASE_4_PROMPT


load_dotenv()


def get_nlq_response(data, case_prompt, case_name):


   if not data:
       return f"No {case_name} detected in the system."
  
   try:
       api_key = os.getenv("AI_API_KEY")
       prompt = f"{case_prompt}\n\nData: {json.dumps(data, default=str)}"
      
       response = requests.post(
           "https://api.openai.com/v1/chat/completions",
           headers={
               "Authorization": f"Bearer {api_key}",
               "Content-Type": "application/json"
           },
           json={
               "model": "gpt-3.5-turbo",
               "messages": [{"role": "user", "content": prompt}],
               "temperature": 0.3,
               "max_tokens": 2048
           }
       )
      
       if response.status_code == 200:
           result = response.json()
           return result['choices'][0]['message']['content']
       else:
           return f"API Error {response.status_code}: {response.text}. Found {len(data)} {case_name}."
      
   except Exception as e:
       return f"Error: {str(e)}. Found {len(data)} {case_name}."


# Simple functions for each case
def translate_price_anomalies_to_nlq(anomalies):
   return get_nlq_response(anomalies, CASE_1_PROMPT, "price anomalies")


def translate_less_sale_days_to_nlq(sales_data):
   return get_nlq_response(sales_data, CASE_2_PROMPT, "less sale days")


def translate_spam_transactions_to_nlq(spam_data):
   return get_nlq_response(spam_data, CASE_3_PROMPT, "spam transactions")


def translate_failed_transactions_to_nlq(failed_data):
   """Handles the conversion for failed transactions."""
   return get_nlq_response(failed_data, CASE_4_PROMPT, "failed transactions")
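

For a quick dry run, you can feed the translator a record shaped like the Case 1 output, reusing the sample values from the CASE_1_PROMPT example (this makes a live OpenAI call, so AI_API_KEY must be set; the record below is illustrative):

sample_anomaly = [{
   "transaction_id": 904,
   "customer_id": 5,
   "customer_name": "Eva Green",
   "transaction_date": "2025-09-12T00:00:00",
   "status": "SUCCESS",
   "total_amount": 68.00,
   "products": "Mouse",
   "product_details": "Mouse (1 x 68.00, actual 500.00)"
}]

print(translate_price_anomalies_to_nlq(sample_anomaly))
# Expected shape: "Transaction ID 904: Customer Eva Green (ID: 5) purchased Mouse on September 12, 2025. ..."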



What It Looks Like (sample outputs)

Short, consistent lines your teams can paste straight into Slack or tickets.

Price Anomalies

Transaction ID 921: Customer Alice Johnson (ID: 1) purchased Laptop on September 16, 2025. The product has an actual catalog price of ₹50000.00 but was sold for ₹500.00, resulting in a 99.0% discount. This represents a CRITICAL price anomaly - significant revenue loss.

Transaction ID 923: Customer Charlie Brown (ID: 3) purchased Headphones on September 16, 2025. The product has an actual catalog price of ₹2000.00 but was sold for ₹2500.00, resulting in a 25.0% markup. This represents a HIGH price anomaly - customer overcharged.

Low Sales Days

September 3, 2025: 78 transactions, ₹870,400 sales - low performance

September 6, 2025: 52 transactions, ₹446,900 sales - low performance

Spam Bursts

Customer Quentin Fox (ID: 17) made 6 transactions on September 16, 2025 at 16:31 - suspicious activity detected

Customer Rachel Stone (ID: 18) made 6 transactions on September 16, 2025 at 16:41 - suspicious activity detected

Failed Payments

Transaction Failed: 2025-09-16 17:00:49 - Customer: Ian Clark (ID: 9) - Product: Monitor

Transaction Failed: 2025-09-16 16:57:35 - Customer: Eva Green (ID: 5) - Product: Printer




Key Benefits: Business Impact

  1. Protect revenue in real time
    Catch loss-making discounts and broken flows before they spread.
  2. Shorten time-to-resolution
    Alerts arrive with who/what/why—fewer handoffs, faster fixes.
  3. Reduce manual drudgery
    Machines watch 24/7; humans focus on judgment calls.
  4. Strengthen trust
    Consistent, explainable decisions that stand up to internal and external review.



GitHub: https://github.com/piyushghughu-superteams-dotcom/Anomaly-Detection




About Superteams.ai

Superteams.ai helps build AI workflows for the BFSI sector. Let's talk.

