Enterprises that have invested heavily in SAP S/4HANA on‑premise are now facing a strategic crossroads: stay on legacy hardware, move to SAP S/4HANA Cloud, or adopt a hybrid model. The migration journey is notoriously complex—spanning system landscape analysis, custom code remediation, data harmonisation, and post‑go‑live optimisation.
Enter generative AI. Generative assistants such as SAP's Joule, together with large language models (LLMs) such as OpenAI's GPT‑4 and specialised foundation models trained on SAP‑centric corpora, can read, understand, and generate SAP‑specific artefacts at scale. When coupled with SAP's extensibility framework, these models become powerful assistants that accelerate cloud migration, reduce manual effort, and embed intelligence into day‑to‑day operations.
In this post we’ll explore how generative AI can be harnessed across the entire S/4HANA migration lifecycle, provide concrete, actionable steps you can start using today, and finish with a look at where the technology is headed.
| Phase | Typical Pain Points | Traditional Approach |
|---|---|---|
| Landscape Assessment | Identifying on‑premise custom objects, dependencies, and obsolete components | Manual inventory, spreadsheets, SAP Solution Manager scans |
| Custom Code Remediation | ABAP code that uses deprecated APIs, hard‑coded client numbers, or non‑compatible UI5 components | SAP Code Inspector, ATC runs, manual refactoring |
| Data Migration | Mapping legacy data models to new S/4HANA data structures, handling volume, ensuring data quality | SAP S/4HANA Migration Cockpit, custom scripts |
| Testing & Validation | Creating realistic test data, covering edge cases, regression testing | Manual test case creation, test automation frameworks |
| Operations & Optimisation | Continuous monitoring, anomaly detection, and recommendation for performance tuning | SAP Solution Manager alerts, manual performance reviews |
The common denominator is knowledge‑intensive, repetitive work that consumes months of senior‑consultant time. Generative AI can shift the effort from “manual execution” to “guided orchestration,” freeing resources for higher‑value activities such as business process redesign.
Generative AI can ingest system metadata (e.g., SAPCAR‑extracted transport logs, ATC findings) and produce a concise, visual report that highlights custom objects, their dependencies, and a per‑object migration risk level.
Example prompt to an LLM:

```text
You are an SAP migration analyst. Based on the following ATC findings, list all custom function modules that call BAPI_PO_CREATE1 and assign a migration risk level (Low/Medium/High) based on usage frequency and parameter complexity.

[Insert ATC JSON payload here]
```
The model returns a JSON payload that can be fed directly into a dashboard:

```json
{
  "custom_functions": [
    {
      "name": "Z_FM_CREATE_PO",
      "calls": ["BAPI_PO_CREATE1"],
      "risk": "High",
      "usage_count": 1245
    },
    {
      "name": "Z_FM_UPDATE_PO",
      "calls": ["BAPI_PO_CHANGE"],
      "risk": "Medium",
      "usage_count": 342
    }
  ]
}
```
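To make the report consumable, the payload can be rendered as a sorted risk matrix for the dashboard. The sketch below is purely illustrative (it is not part of any SAP toolkit); the field names simply follow the sample payload above:

```python
# risk_matrix.py - render the LLM's JSON risk report as a markdown table
# Illustrative sketch; field names follow the sample payload above.
import json

def to_risk_matrix(payload: str) -> str:
    """Sort custom objects by risk (High first) and emit a markdown table."""
    data = json.loads(payload)
    order = {"High": 0, "Medium": 1, "Low": 2}
    rows = sorted(data["custom_functions"], key=lambda r: order.get(r["risk"], 3))
    lines = ["| Object | Calls | Risk | Usage |", "|---|---|---|---|"]
    for r in rows:
        calls = ", ".join(r["calls"])
        lines.append(f"| {r['name']} | {calls} | {r['risk']} | {r['usage_count']} |")
    return "\n".join(lines)

if __name__ == "__main__":
    sample = json.dumps({"custom_functions": [
        {"name": "Z_FM_UPDATE_PO", "calls": ["BAPI_PO_CHANGE"],
         "risk": "Medium", "usage_count": 342},
        {"name": "Z_FM_CREATE_PO", "calls": ["BAPI_PO_CREATE1"],
         "risk": "High", "usage_count": 1245}]})
    print(to_risk_matrix(sample))
```

The same function can feed a Power BI or Fiori tile instead of markdown; only the rendering step changes.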
LLMs trained on ABAP syntax can rewrite deprecated code snippets, replace obsolete APIs, and even generate ABAP Unit tests. Below is a before‑and‑after illustration for a classic SELECT * anti‑pattern that must be eliminated for S/4HANA Cloud.
Original ABAP (legacy):

```abap
SELECT *
  FROM ekko
  INTO TABLE @lt_ekko
  WHERE bukrs = @lv_bukrs.
```

LLM‑generated replacement:

```abap
SELECT ebeln, bukrs, lifnr, bsart, erdat, aedat
  FROM ekko
  INTO CORRESPONDING FIELDS OF TABLE @lt_ekko
  WHERE bukrs = @lv_bukrs.
```
The model also adds a performance‑oriented comment and a unit test:

```abap
"--- Performance note: SELECT * is prohibited in S/4HANA Cloud.
"--- Use an explicit field list to enable predicate push-down optimisation.
CLASS zcl_ekko_test DEFINITION FOR TESTING RISK LEVEL HARMLESS DURATION SHORT.
  PRIVATE SECTION.
    METHODS test_select_fields FOR TESTING.
ENDCLASS.

CLASS zcl_ekko_test IMPLEMENTATION.
  METHOD test_select_fields.
    DATA lt_ekko TYPE TABLE OF ekko.
    " CORRESPONDING FIELDS maps the explicit field list onto the full structure
    SELECT ebeln, bukrs, lifnr, bsart, erdat, aedat
      FROM ekko
      INTO CORRESPONDING FIELDS OF TABLE @lt_ekko
      WHERE bukrs = '1000'.
    cl_abap_unit_assert=>assert_not_initial( act = lines( lt_ekko ) ).
  ENDMETHOD.
ENDCLASS.
```
When moving to S/4HANA Cloud, the underlying CDS (Core Data Services) model often differs from the classic ABAP Dictionary. Generative AI can propose CDS view definitions based on legacy table structures.
Prompt:

```text
Generate a CDS view for the legacy table MARA that exposes the fields MATNR, MTART, MATKL, and adds a calculated column `is_active` (Boolean) that is true when LVORM = ''.
```
LLM Output (CDS):

```abap
@AbapCatalog.sqlViewName: 'ZV_MARA'
@AccessControl.authorizationCheck: #NOT_REQUIRED
define view Z_CDS_MARA as select from mara {
  key matnr,
      mtart,
      matkl,
      @EndUserText.label: 'Is Active'
      // ABAP CDS has no boolean literals; 'X'/'' is the ABAP boolean convention
      case when lvorm = '' then 'X' else '' end as is_active
}
```
The generated view can be directly transported to the cloud system, dramatically shortening the data‑model harmonisation phase.
Creating realistic test data is a bottleneck. By feeding the LLM with a handful of real‑world master‑data examples, it can synthesize synthetic data that respects business rules (e.g., valid customer‑vendor combinations, fiscal year constraints).
```json
{
  "template": {
    "KUNNR": "CUST{{5-digit}}",
    "LAND1": ["DE", "US", "FR"],
    "UMSATZ": "float(0, 1000000)",
    "ERDAT": "date(past_years=2)"
  },
  "count": 500
}
```
The model then produces a CSV ready for upload into the SAP S/4HANA Migration Cockpit.
Post‑go‑live, the cloud environment generates a flood of logs, performance metrics, and alert data. Generative AI can summarise these streams into concise daily digests, surface anomalies, and recommend performance‑tuning actions.
Sample configuration (SAP Cloud ALM) generated by AI:

```yaml
# almtasks.yaml
tasks:
  - name: "Daily Health Digest"
    schedule: "0 7 * * *"
    script: |
      #!/usr/bin/env python3
      import os, requests
      token = requests.post("https://auth.example.com/oauth/token",
                            data={"grant_type": "client_credentials"}).json()["access_token"]
      logs = requests.get("https://api.cloudalm.example.com/v1/logs?since=24h",
                          headers={"Authorization": f"Bearer {token}"}).json()
      summary = ai_summarize(logs)  # Calls your LLM endpoint
      requests.post("https://slack.com/api/chat.postMessage",
                    json={"channel": "#ops-digest", "text": summary},
                    headers={"Authorization": f"Bearer {os.environ['SLACK_BOT_TOKEN']}"})
```
Below is a step‑by‑step playbook you can adopt immediately. Each step includes a concrete artifact you can copy‑paste into your project repository.
```python
# ai_service.py
import os

import httpx

API_KEY = os.getenv("AI_API_KEY")
BASE_URL = "https://api.openai.com/v1/chat/completions"

def ask(prompt: str, temperature: float = 0.2) -> str:
    payload = {
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "max_tokens": 1500,
    }
    headers = {"Authorization": f"Bearer {API_KEY}"}
    response = httpx.post(BASE_URL, json=payload, headers=headers, timeout=30)
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```
Export ATC findings, transport logs, and SAP Solution Manager data to JSON files. Example command for ATC, shown with the open‑source `sapcli` tool (the exact command and flags depend on your tooling):

```shell
# Export ATC findings to JSON (illustrative; adapt to your CLI of choice)
sapcli atc export --system=PRD --output=atc_findings.json
```
```python
# generate_report.py
import json

from ai_service import ask

with open("atc_findings.json") as f:
    atc_payload = json.load(f)

prompt = f"""
You are an SAP migration analyst. Based on the following ATC findings, list all custom function modules that call BAPI_PO_CREATE1 and assign a migration risk level (Low/Medium/High) based on usage frequency and parameter complexity.

{json.dumps(atc_payload, indent=2)}
"""

report = ask(prompt)
print(report)  # Pipe into a markdown file or Power BI
```
Create a batch processor that reads custom objects, sends them to the LLM for transformation, and writes the result back to an ABAP repository (e.g., Git for ABAP).
" batch_refactor.abap
REPORT z_batch_refactor.
DATA: lt_objects TYPE TABLE OF seoclass,
lo_ai TYPE REF TO zcl_ai_client.
SELECT * FROM seoclass INTO TABLE lt_objects WHERE devclass = 'ZCUSTOM'.
lo_ai = NEW zcl_ai_client( iv_api_key = '<<YOUR_KEY>>' ).
LOOP AT lt_objects INTO DATA(ls_obj).
DATA(lv_source) = zcl_abap_reader=>read( iv_name = ls_obj-clsname ).
DATA(lv_prompt) = |Refactor the following ABAP class to be S/4HANA Cloud‑compatible:\n{ lv_source }|.
DATA(lv_refactored) = lo_ai->ask( iv_prompt = lv_prompt ).
zcl_abap_writer=>write( iv_name = ls_obj-clsname iv_code = lv_refactored ).
ENDLOOP.
Tip: Run the batch on a non‑productive branch, review the diff in Git, and approve via pull request before transporting.
Create a CSV that maps legacy tables to target CDS view specifications. Feed each row to the LLM using the prompt template shown earlier, and output a .cds file per view.
```shell
python generate_cds.py --mapping legacy_to_cds.csv --output ./cds/
```
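A minimal sketch of what `generate_cds.py` might look like. The CSV column names (`legacy_table`, `target_view`, `fields`) and the injectable `ask` callable are assumptions; in practice you would pass in the `ask()` helper from your AI service module:

```python
# generate_cds.py - one .cds file per row of the mapping CSV (sketch)
# Assumed CSV columns: legacy_table, target_view, fields
import csv
import pathlib

def build_prompt(row: dict) -> str:
    """Turn one mapping row into the CDS-generation prompt shown earlier."""
    return (f"Generate a CDS view named {row['target_view']} for the legacy "
            f"table {row['legacy_table']} that exposes the fields {row['fields']}.")

def generate(mapping_csv: str, out_dir: str, ask) -> list[str]:
    """Feed each mapping row to the LLM and write one .cds file per view."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    with open(mapping_csv, newline="") as f:
        for row in csv.DictReader(f):
            cds_source = ask(build_prompt(row))        # LLM call injected here
            target = out / f"{row['target_view']}.cds"
            target.write_text(cds_source)
            written.append(str(target))
    return written
```

Keeping the LLM call injectable makes the script trivial to unit-test with a stubbed `ask` before pointing it at a paid API.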
Use the JSON template generator (Step 3) to produce synthetic master data, then load it via the SAP S/4HANA Migration Cockpit.
```shell
python synth_data.py --template test_template.json --count 1000 > synthetic_customers.csv
# In the Migration Cockpit: Upload CSV → Map → Activate
```
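For illustration, a minimal `synth_data.py` could interpret the template shown in Step 3. The placeholder grammar (`{{5-digit}}`, value lists, `float(min, max)`, `date(past_years=N)`) is inferred from the sample template; a production generator would add business-rule validation on top:

```python
# synth_data.py - expand the JSON template into synthetic CSV rows (sketch)
# Supports only the placeholder styles seen in the sample template.
import csv
import datetime
import io
import random

def expand(rule):
    if isinstance(rule, list):                      # pick one allowed value
        return random.choice(rule)
    if "{{5-digit}}" in rule:                       # e.g. CUST{{5-digit}}
        return rule.replace("{{5-digit}}", f"{random.randint(0, 99999):05d}")
    if rule.startswith("float("):                   # float(min, max)
        lo, hi = (float(x) for x in rule[6:-1].split(","))
        return round(random.uniform(lo, hi), 2)
    if rule.startswith("date(past_years="):         # random date within N years
        years = int(rule[len("date(past_years="):-1])
        days = random.randint(0, years * 365)
        return (datetime.date.today() - datetime.timedelta(days=days)).isoformat()
    return rule                                     # literal value

def synthesize(template: dict) -> str:
    """Return CSV text with template['count'] synthetic rows."""
    fields = template["template"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    for _ in range(template["count"]):
        writer.writerow({k: expand(v) for k, v in fields.items()})
    return buf.getvalue()
```

This rule-based expansion is deliberately deterministic in shape; the LLM's role is to propose the template and constraints, not to hand-write every row.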
Add the Daily Health Digest (shown earlier) to your CI/CD pipeline. Ensure the bot runs under a service account with read‑only access to Cloud ALM APIs.
```yaml
# .github/workflows/health-digest.yml
name: Daily Health Digest
on:
  schedule:
    - cron: '0 7 * * *'  # 07:00 UTC daily
jobs:
  digest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Run Digest Script
        env:
          AI_API_KEY: ${{ secrets.AI_API_KEY }}
          SLACK_BOT_TOKEN: ${{ secrets.SLACK_BOT_TOKEN }}
        run: |
          python scripts/daily_digest.py
```
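The workflow expects a `scripts/daily_digest.py`; a hedged sketch follows. The Cloud ALM URL is a placeholder, `ask()` is the helper from `ai_service.py`, and the prompt-capping limit is an assumption to keep the log payload inside the model's context window:

```python
# scripts/daily_digest.py - standalone version of the digest step (sketch)
# Endpoints are placeholders; ask() comes from ai_service.py shown earlier.
import os

def build_digest_prompt(log_lines: list[str], max_lines: int = 200) -> str:
    """Cap the log payload so the prompt fits the model's context window."""
    sample = "\n".join(log_lines[:max_lines])
    return ("Summarise the following SAP Cloud ALM log lines into a short "
            f"health digest, listing anomalies first:\n{sample}")

def post_to_slack(text: str) -> None:
    import httpx  # third-party; imported lazily so the helpers stay testable
    httpx.post("https://slack.com/api/chat.postMessage",
               json={"channel": "#ops-digest", "text": text},
               headers={"Authorization": f"Bearer {os.environ['SLACK_BOT_TOKEN']}"})

if __name__ == "__main__":
    import httpx
    from ai_service import ask
    logs = httpx.get("https://api.cloudalm.example.com/v1/logs?since=24h").json()
    post_to_slack(ask(build_digest_prompt(logs)))
```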
Company: Global Manufacturing Corp (GMC)
Scope: 120 custom ABAP objects, 2 TB of master data, 3 SAP landscapes (ECC → S/4HANA Cloud).
Outcome:
- Landscape assessment completed in 2 weeks (vs. 8 weeks).
- ABAP remediation time reduced from 12 weeks to 5 weeks.
- Data migration scripts generated automatically; data‑validation defects fell from 312 to 27.
- Post‑go‑live support tickets decreased by 30 % thanks to AI‑driven health digests.
The secret sauce? A closed‑loop workflow where each AI‑generated artefact was version‑controlled, peer‑reviewed, and fed back into the model as “ground‑truth” for subsequent iterations.
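One way to implement that closed loop is a small append-only store of approved prompt/response pairs, which later prompts can reuse as few-shot examples. The file name and record shape below are assumptions for illustration, not an SAP facility:

```python
# feedback_store.py - persist peer-approved artefacts as few-shot examples (sketch)
import json
import pathlib

STORE = pathlib.Path("approved_examples.jsonl")  # assumed location

def record_approved(prompt: str, response: str, reviewer: str) -> None:
    """Append a reviewed artefact so future prompts can cite it as ground truth."""
    with STORE.open("a") as f:
        f.write(json.dumps({"prompt": prompt, "response": response,
                            "reviewer": reviewer}) + "\n")

def few_shot_prefix(n: int = 3) -> str:
    """Build a prefix from the n most recent approved examples."""
    if not STORE.exists():
        return ""
    examples = [json.loads(line) for line in STORE.read_text().splitlines()]
    blocks = [f"Example input:\n{e['prompt']}\nApproved output:\n{e['response']}"
              for e in examples[-n:]]
    return "\n\n".join(blocks)
```

Prepending `few_shot_prefix()` to the refactoring prompts from Step 3 steers the model toward patterns your reviewers have already accepted.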
| ✅ Takeaway | Insight |
|---|---|
| Accelerate discovery | LLMs turn raw ATC logs into actionable risk matrices in minutes. |
| Automate remediation | ABAP refactoring, CDS view creation, and unit‑test scaffolding can be generated at scale, cutting manual effort by more than 50 %. |
| Synthetic data on demand | Rule‑based prompts produce high‑quality test data without exposing sensitive production records. |
| Operational intelligence | Daily AI‑summarised digests keep cloud teams proactive rather than reactive. |
| Governance matters | Store prompts, responses, and API keys securely; enforce review gates before transporting AI‑generated code. |
| Iterative improvement | Feed corrected artefacts back into the model to continuously raise output quality. |
- Domain‑Specific Foundation Models – SAP is investing in models pre‑trained on billions of SAP‑specific artefacts (ABAP, CDS, BOPF, Fiori). Expect higher fidelity, fewer hallucinations, and built‑in compliance checks.
- AI‑Driven Process Mining Integration – Combining generative AI with SAP Process Mining will enable automatic identification of "golden‑path" migration steps, suggesting optimal data‑model transformations with little human intervention.
- Embedded Copilot in SAP Business Technology Platform (BTP) – Future releases are expected to expose a Copilot‑as‑a‑Service directly inside the BTP cockpit, letting you invoke AI actions (e.g., "Create a migration‑ready CDS view for table X") via a UI button.
- Zero‑Touch Migration – The longer‑term vision (sometimes projected for as early as 2027) is a fully autonomous pipeline: AI scans the source system, rewrites code, generates migration objects, validates with synthetic data, and orchestrates the cut‑over with minimal human touch.
- Responsible AI Governance – As AI becomes integral to core ERP transformations, SAP will provide a trust layer (audit trails, model explainability, bias detection) to satisfy regulatory and audit requirements.
Bottom line: Generative AI is moving from an experimental helper to a core competency in SAP S/4HANA cloud migration. Early adopters who embed AI into their migration playbooks will not only accelerate timelines but also unlock a new level of operational intelligence that extends far beyond go‑live.
Migration is no longer a “big‑bang” project that stalls your business for months. By leveraging generative AI, you can turn the migration into a continuous, data‑driven transformation that delivers immediate value, reduces risk, and sets the stage for intelligent, self‑optimising operations in the cloud. The tools are ready, the models are maturing, and the opportunity is yours to seize.
Happy migrating!

SAP Expert and Training Specialist with 6+ years of experience. Helped 500+ professionals advance their SAP careers.