SAP Ariba and SAP HANA to Azure ADLS Gen 2

Relay 300 Reputation points
2025-06-16T02:24:18.6566667+00:00

Hello,

How can I bring data from SAP Ariba and SAP HANA to ADLS Gen2?

Could someone please help me with a reference architecture?

Thanks a lot

SAP HANA on Azure Large Instances
Microsoft branding terminology for an Azure offer to run HANA instances on SAP HANA hardware deployed in Large Instance stamps in different Azure regions.

Accepted answer
  1. Nandamuri Pranay Teja 4,855 Reputation points Microsoft External Staff Moderator
    2025-06-16T05:00:14.79+00:00

    Hello Relay

    To bring data from SAP Ariba and SAP HANA to Azure Data Lake Storage Gen2 (ADLS Gen2), you can use Azure Data Factory (ADF) together with its SAP connectors: SAP HANA data can be copied to ADLS Gen2 directly using ADF's built-in SAP HANA connector.

    • In ADF, go to Manage > Linked Services > New.
    • Select SAP HANA.
    • Configure: Server: SAP HANA server address. Port: the default is 3NN15, where NN is the two-digit instance number (e.g., 30015 for instance 00). Authentication: Basic (username/password) or Windows authentication. Integration Runtime: choose an Azure IR, or a self-hosted IR for on-premises servers.
    • Test the connection and save.
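The steps above boil down to a linked-service JSON definition that ADF stores for you. A minimal sketch of that payload is shown below; the server address, port, user, and integration runtime name are placeholders, not real values, and in practice the password should be referenced from Azure Key Vault rather than written inline:

```python
import json

# Sketch of an ADF linked-service definition for SAP HANA.
# Server, port, and user name are placeholders -- substitute your own values.
sap_hana_linked_service = {
    "name": "SapHanaLinkedService",
    "properties": {
        "type": "SapHana",
        "typeProperties": {
            # Host:port of the HANA server; 30015 assumes instance number 00.
            "connectionString": "Server=myhanaserver:30015;UserName=MYUSER;Password=<secret>"
        },
        "connectVia": {
            # A self-hosted IR is needed when the server is on-premises;
            # "SelfHostedIR" is an assumed runtime name.
            "referenceName": "SelfHostedIR",
            "type": "IntegrationRuntimeReference"
        }
    }
}

print(json.dumps(sap_hana_linked_service, indent=2))
```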

    Next, in ADF, create a dataset for SAP HANA.

    • Select the SAP HANA linked service.
    • Specify the schema and table or write a custom SQL query to extract data (e.g., from Analytic/Calculation views or Row/Column tables).
    • Go to Linked Services > New > Azure Data Lake Storage Gen2.
    • Configure: Storage Account Name: Select your ADLS Gen2 account. Authentication: Use Account Key, Service Principal, or Managed Identity. Test the connection and save.

    Create a dataset for ADLS Gen2.

    • Specify the file format (e.g., Parquet, CSV) and folder path (e.g., data/saphana/).
    • Create a pipeline in ADF and add a Copy Activity.
    • Set the SAP HANA dataset as the Source.
    • Set the ADLS Gen2 dataset as the Sink.
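Putting the pieces together, the pipeline with its Copy Activity can be sketched as the following JSON-shaped definition. The dataset names (`SapHanaDataset`, `AdlsParquetDataset`) and the schema/table in the query are assumptions; match them to the datasets you created above:

```python
import json

# Sketch of an ADF pipeline with one Copy Activity: SAP HANA source,
# Parquet sink on ADLS Gen2. Names and the sample query are placeholders.
pipeline = {
    "name": "CopySapHanaToAdls",
    "properties": {
        "activities": [
            {
                "name": "CopyHanaTable",
                "type": "Copy",
                "inputs": [{"referenceName": "SapHanaDataset", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "AdlsParquetDataset", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {
                        "type": "SapHanaSource",
                        # Custom SQL against a table or Calculation view.
                        "query": 'SELECT * FROM "MYSCHEMA"."MYTABLE"'
                    },
                    "sink": {"type": "ParquetSink"}
                }
            }
        ]
    }
}

print(json.dumps(pipeline, indent=2))
```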

    SAP Ariba data is typically accessed via APIs (REST or SOAP) or exported reports. Since ADF does not have a native SAP Ariba connector, you can use the HTTP or REST connector or third-party tools like CData Sync.

    1. In ADF, go to Linked Services > New > REST.
    2. Configure: Base URL: SAP Ariba API endpoint (e.g., https://api.ariba.com/v2/), Authentication: OAuth 2.0 with Client Credentials flow. Enter Client ID, Client Secret, and token endpoint.
    3. Create a dataset for the REST service.
    4. Specify the relative URL and parameters for the API call (e.g., /reports for reporting data).
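Under the hood, the REST linked service first exchanges the Client ID and Client Secret for a bearer token via the client-credentials grant. A small sketch of what that token request looks like is below; the token URL is a placeholder, so use the OAuth endpoint shown in your SAP Ariba developer portal:

```python
from urllib.parse import urlencode

# Placeholder token endpoint -- replace with the OAuth URL from your
# Ariba developer portal.
TOKEN_URL = "https://api.ariba.com/v2/oauth/token"

def build_token_request(client_id: str, client_secret: str) -> dict:
    """Return the URL, headers, and form body for a client-credentials grant."""
    return {
        "url": TOKEN_URL,
        "headers": {"Content-Type": "application/x-www-form-urlencoded"},
        "body": urlencode({
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
        }),
    }

req = build_token_request("my-client-id", "my-client-secret")
print(req["body"])
```

ADF performs this exchange for you when the linked service is set to OAuth 2.0 Client Credentials; the sketch is only to show which fields need to be configured.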

    Then add a Copy Activity in an ADF pipeline.

    • Set the REST dataset as the Source.
    • Set the ADLS Gen2 dataset as the Sink.
    • Map the API response fields to the ADLS Gen2 file structure.
    • Use pagination rules if the API returns paginated data.
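ADF's REST source expresses pagination as key/value rules mapping a request part (query parameter, header, or absolute URL) to a JSONPath in the previous response. A sketch under the assumption that the API returns a `PageToken` field (an assumed name, not a documented Ariba property):

```python
# Sketch of an ADF REST-source definition with a pagination rule.
# "QueryParameters.pageToken" re-issues the request with the token found
# at $.PageToken in the previous response body; adjust both sides to match
# how your Ariba API actually signals the next page.
rest_source = {
    "type": "RestSource",
    "paginationRules": {
        "QueryParameters.pageToken": "$.PageToken"
    }
}

print(rest_source["paginationRules"])
```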

    Use API filters (e.g., modifiedDate) to fetch only new or updated records. Store the last processed timestamp in a control table or ADLS Gen2 file to track increments.
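The high-watermark pattern described above can be sketched as follows. The `modifiedDate` field name follows the answer text; the filter syntax and the control-file path are assumptions to adapt to your API and storage layout:

```python
from datetime import datetime, timezone

# Assumed path for the control file holding the last processed timestamp;
# in production this would live in ADLS Gen2 or a control table.
WATERMARK_FILE = "last_run.txt"

def build_filter(last_run_iso: str) -> dict:
    """Query parameters that fetch only records changed since the last run."""
    return {"filters": f'{{"modifiedDate":">{last_run_iso}"}}'}

def advance_watermark(path: str) -> str:
    """Persist the current UTC time as the new watermark after a successful copy."""
    now = datetime.now(timezone.utc).isoformat()
    with open(path, "w") as f:
        f.write(now)
    return now

params = build_filter("2025-06-01T00:00:00Z")
print(params["filters"])
```

In ADF the same idea is usually implemented with a Lookup activity that reads the watermark, a Copy Activity whose query uses it, and a final activity that writes the new watermark.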

    References: https://learn.microsoft.com/en-us/azure/data-factory/connector-sap-hana?tabs=data-factory

    Hope the above answer helps! Please let us know if you have any further queries.


    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you, as this can be beneficial to other community members.

    1 person found this answer helpful.

0 additional answers
