SCN : Blog List - Information Lifecycle Management

SAP Insider Article on SAP ILM 702


This SAPinsider article highlights the most important features and enhancements of the new SAP NetWeaver Information Lifecycle Management 702 release. It comes straight from the core team that conceived and developed the release, and provides a quick yet comprehensive overview of what the new product offers.

So take a look for yourself to learn how you can benefit most from SAP ILM 702, which is currently in ramp-up until mid-May 2011!


Archiving Processes using archive object BPM_PROC in SAP NetWeaver CE 7.20


I was recently asked if there was a way to archive all of the completed or canceled processes in our SAP NetWeaver CE 7.2 system.  I did some quick research and found that this is possible by implementing archive object BPM_PROC.  As this was going to be my first attempt at XML archiving, I needed some additional help and information.  I found the following whitepaper on SDN:

Process Archiving using NetWeaver Business Process Management by Chembrakalathil Venugopal

as well as the following information in the SAP Help Library:

Process Data Archiving

Administration of the XML Data Archiving Service

Java Archiving Cockpit

I was now ready to begin.  A decision had already been made to use a file system to store the archive files instead of a WebDAV system.  So, a file system was created – make sure this file system is made available to all relevant servers.  (See the SAP Help for the Java Archiving Cockpit.)

Next step:  Create an archive administrator user ID that will be used to archive the data.  Go to SAP NetWeaver Administration -> Operation Management -> Users and Access -> Identity Management -> Create User

The necessary roles that need to be assigned are:

  • SAP_ARCH_XMLDAS_VIEW
  • SAP_ARCH_SUPERADMIN
  • NWA_SUPERADMIN
  • XMLDASSecurityRole

When creating the user for Archive Administration, be sure to select “Technical User” for the Security Policy.  This user name is needed for the next step.

(screenshot)

 

Next step:  Configure your Java system environment by configuring the DAS (Data Archiving Service) destination.  Go to SAP NetWeaver Administration -> Configuration Management -> Security -> Destinations

Configure/edit DASdefault.  Verify that the URL is correct; then, on the Logon Data tab, change the Authentication to Basic (User ID and Password) and enter the user ID and password of the user you created in the previous step.

(screenshot)

Next step:  Configure the XML DAS (Data Archiving Service). 

Go to SAP NetWeaver Administration -> Operation Management -> Data and Databases -> XML DAS Administration

(screenshot)

Go to the Archive Store Management table and click on Define to create an Archive Store.

(screenshot)

Enter the name, description, and store type (File System or WebDAV System).  This example uses a file system, so the path that was previously created is entered.

Once you have saved the new Archive Store, select it and click on Test Selected to make sure the configuration is correct.

(screenshot)

Next step:  Go to the Home Path Synchronization tab and create a new Home Collection.  For the Home Path, enter /<system name>/bpm_proc/, select the relevant Archive Store from the drop-down, and click on Start.

(screenshot)

Once a Home Collection has been created, you can now view the Archive Hierarchy & Store Assignment information:

(screenshot)

The system is now ready for archiving.  You can archive processes from either the Java Archiving Cockpit:

(screenshot)

Write tab:

(screenshot)

Or from SAP NetWeaver Administrator -> Availability and Performance Management -> BPM System Overview (select either the Completed or Canceled Processes):

(screenshot)

And then click on the Archive button:

(screenshot)

You have the following parameters by which you can archive Processes:

(screenshot)

In the Selection Criteria, you can enter a time period, or from the Manage Processes screen you can manually select the entries you wish to archive.  In the Technical Settings, you have the option to run the archive job in test/simulation mode.  You can also change the Comment to something more meaningful.  In the Schedule Settings, you can start the job immediately or enter a date and time.

When you click on Start Write Session, the following message is displayed:

(screenshot)

You can also choose to start the delete job immediately, or you can manually schedule it via the Delete tab:

(screenshot)

Once the archive job has been submitted, you can check the status on the Archiving Monitor tab:

(screenshot)

Viewing the archived processes

Note: you will need a BPM role assigned, such as SAP_BPM_SuperDisplay.

Then, from within Manage Processes: Process Instances, click on the Advanced link:

(screenshot)

Select "Archived" from the Advanced drop down and click on Go:

(screenshot)

You will know you are viewing archived data if the Archived checkbox is selected within the Details of the relevant process:

(screenshot)

SAP NetWeaver ILM 7.02 Now Generally Available


Benefit from More Automation, Industry Coverage, and Predelivered Content

With the successful completion of its ramp-up on May 12, 2011, SAP NetWeaver Information Lifecycle Management 7.02 is now generally available to all SAP customers. 

Without exaggeration, one can say that SAP NetWeaver ILM 7.02 is a huge leap forward compared to SAP ILM 7.01, the first release of SAP’s information lifecycle management solution. The new release comes with a long list of new features and enhancements that will considerably help you manage your data volumes and retention times for both live and legacy systems. This makes us very proud to be able to offer such great value to our customers!

So what can you expect from this brand new release? Simply a lot! 

The main focus in developing the new release was on further enhancing process automation, for instance in system decommissioning; expanding the ILM coverage to include the industry solutions SAP Utilities and SAP Oil & Gas; and providing additional predelivered content, such as queries for easier business warehouse reporting. Prior to all of that, we ran an extensive roll-in project to collect feedback from customers and partners to ensure that the new capabilities are exactly what our users need.

Some Highlights

It is not the goal of this blog to provide a full-blown list of all new or enhanced features; that is something you would rather expect to find in the product documentation. Instead, I would prefer to single out some real highlights to give you a first impression of the new release.

Audit Areas

Using so-called “audit areas,” ILM lets you categorize your data according to its retention reason to comply with regulatory requirements. Basically, audit areas are used to select the data for a particular audit purpose, e.g. tax or product liability. From a technical perspective, an audit area covers all data types needed for that audit, including, for example, a list of the object types assigned to the audit area, and the relevant structures, tables, and fields.

Audit areas help you automate the data selection process for reporting on legacy data, and eliminate the manual selection of archive files from the store. SAP predelivers the audit areas TAX and PRODUCT LIABILITY, which you can use as templates for defining your own audit areas in the customer namespace. This can be done using transaction ILMARA:

 

 

Automation for System Decommissioning

Using transaction ILM_TRANS_ADMIN, you can now automate the process of transferring archive administration data and archive files from the legacy system to the ILM retention warehouse system. The new function also provides an automated process for converting the transferred files to apply retention rules and store the converted files in the storage system. The original files can then be deleted manually or automatically through this function.

 

Destruction of Archived Data

The first release of SAP NetWeaver ILM introduced destruction for database data. Using transaction ILM_DESTRUCTION, you can now also destroy archived data (including attachments and print lists) residing in the storage system. It goes without saying that this process always runs in line with the defined ILM retention rules: only data whose expiration date has passed is eligible for destruction and can be added to the work list of objects to be destroyed.

 

Checksums

Using transaction ILMCHECK, you can now verify the completeness of the information in the retention warehouse against the original data transferred from the legacy system. In your checksum analysis, you can compare data from the legacy system with the same data in the retention warehouse system to see whether the sums of specific data or data ranges are identical.

In the screenshot below you can see the result of a checksum run:

 

 

Availability

SAP NetWeaver ILM 7.02 comes as a part of SAP NetWeaver 7.0 EhP2 and is included in the SAP ERP 6.0 EhP5 delivery. After installing SAP ERP EhP5 you only need to activate the relevant ILM business functions to start using it in a live system environment. To use it for system decommissioning, you need to set up a stand-alone retention warehouse system also based on SAP ERP EhP5.

More Information

  • Release Note: Includes a full list of new and enhanced features.
  • Documentation: Contains the “operating instructions” for using SAP NetWeaver ILM.
  • SAPinsider article: Describes the major features of the new release.
  • Installation Note: Provides information on using SAP NetWeaver ILM in the context of SAP ERP EhP5.

Stay Informed

  • SAP NetWeaver ILM on SDN
  • SAP NetWeaver ILM on Twitter

Looking for a way to monitor SAP Data Archiving Jobs?


I have been doing SAP data archiving the old-fashioned way for quite some time: submitting a job, watching it run, and waiting for it to complete.  It is the equivalent of watching a pot of water and waiting for it to boil – “a watched pot never boils.”  I finally came to the conclusion that there had to be a better way to monitor these jobs.

So, I started doing some research and came across two SAP-supplied background events (transaction SM64): SAP_ARCHIVING_WRITE_FINISHED and SAP_ARCHIVING_DELETE_FINISHED.

Now came the tricky part: I needed some way to get notified that these events occurred.  Also, SAP archiving jobs are not scheduled like other jobs.  When you schedule the archive write job via the ADK (Archive Development Kit), a “sub” job actually kicks off the archive write job, so monitoring these jobs can be complicated.  I also did not want to get a notification for EVERY archiving job (write, store, delete), as that would be a lot of notifications.

I decided to go with getting a text message when an archive write job completed.  For most transactional archive objects, I do not configure them to automatically submit the corresponding delete job.  I have a (small) window in which I can schedule jobs, so I want to be able to control when, and how many, archive delete jobs get scheduled.

The next step was the actual notification.  I had help – someone wrote a quick Perl Script that would send a text message.  I chose to have the following message sent - “Get out of bed and check archiving job”.  All the archiving people that read this will be able to relate to that message. 

The final step is tying it all together – the Perl script with the completion of the archive write job.  To do that, an SAP external command was set up via transaction SM49.

The “Parameters for operating system command” field was configured to point to where the Perl script is located.  Save the external command.

Then, schedule the monitor job/program via transaction SM36:

Click on Step -> External command:

Enter the name of the external command that was previously created.  Also, enter the Operating system and Target server. 

Save and then click on Start Condition:

Click on After event:

From the drop-down, select the event you want – in this case, SAP_ARCHIVING_WRITE_FINISHED.

If you leave the Parameter field blank, the external command (in this case the text message notification) will be executed after every archive write job.  If you want it to run only for a specific archive write job, you can wait until the archive write job has started and then enter the archive session number in the Parameter field.  This limits the notification to just that specific archive write job.

Then save this job.  You will get the following message:

This job will then get kicked off after an archive write job has completed successfully. 
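If you want to verify the whole notification chain without waiting for a real archive session, you can also raise the event yourself, either manually in SM64 or from a small test report. Below is a minimal ABAP sketch using the standard function module BP_EVENT_RAISE (the report name is hypothetical, and leaving the event parameter empty is an assumption):

REPORT z_test_arch_event.

" Raise the background event so the SM36 job (and thus the external
" command/notification) can be tested without a real archive write run.
CALL FUNCTION 'BP_EVENT_RAISE'
  EXPORTING
    eventid                = 'SAP_ARCHIVING_WRITE_FINISHED'
    eventparm              = ''   " or a specific archive session number
  EXCEPTIONS
    bad_eventid            = 1
    eventid_does_not_exist = 2
    eventid_missing        = 3
    raise_failed           = 4
    OTHERS                 = 5.
IF sy-subrc <> 0.
  WRITE: / 'Event could not be raised, sy-subrc =', sy-subrc.
ENDIF.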

 Hope this helps someone as it helped me.

Did you all know there is a new discussion forum on SCN?


Just thought I would let people know that there is a new discussion forum in the EIM space called Information Lifecycle Management.

(screenshot)

 

I have been working with SAP archiving for a long time (since 1999), and whenever I went out to SCN to see if someone needed archiving help, I had to do a general search across all forums on words such as archiving, SARA, SARI, and ILM.  The questions were posted in all sorts of forums, such as those for the relevant modules (ERP SD, ERP MM, etc.) as well as SAP NetWeaver Administration and ABAP, General.

A few months ago, I came across the following thread: "How about a separate forum for Archiving / DVM / ILM?"

(screenshot)

I thought this was an excellent suggestion, so I forwarded the thread to the SAP ILM team to see what they thought.  They also thought it was a good idea to have a dedicated forum for archiving and ILM related questions and topics.  So, they contacted the SCN team and worked out the logistics.  Moderators were assigned to monitor the forum, and it was then ready to be launched!

There are already a few discussions going on.  So, if you are interested in or have questions related to SAP Archiving & Information Lifecycle Management, you now have a common place to post and check for answers/solutions to issues.

A One-Stop Shop for Managing All of Your Business Information


 

When managing the lifecycle of your business information, you need to follow a holistic approach that covers all types of information in all relevant systems: structured and unstructured information from live and legacy systems. And when it comes to decommissioning redundant legacy systems, your information lifecycle management (ILM) strategy must include both SAP and non-SAP systems. For compliance reasons you cannot omit any of these information types or systems.

With its set of dedicated products and services including SAP NetWeaver ILM, SAP BusinessObjects Data Services, and SAP Document Access by OpenText, SAP’s ILM solution suite enables you to establish a truly holistic ILM strategy for your enterprise to control the entire lifecycle of your information. That’s why we like to call it a “one-stop shop” for all your ILM needs. It helps you to reduce system complexity and cost, preserve retention-relevant information, and remain legally compliant with regard to information retention.

“Whether you need to manage structured or unstructured information in your live SAP ERP systems, or decommission any SAP or non-SAP systems, SAP’s ILM solution suite will support you.”

To learn more, read our article “A One-Stop Shop for Managing All of Your Business Information” at http://sapinsider.wispubs.com/Article/A-One-Stop-Shop-for-Managing-All-of-Your-Business-Information/6099.

Ramp-up of SAP NetWeaver ILM 7.03 Started


The new release of SAP NetWeaver Information Lifecycle Management (SAP NetWeaver ILM) comes as part of SAP NetWeaver 7.0 EhP3 and will be delivered to ramp-up customers together with SAP ERP 6.0 EhP6. SAP NetWeaver ILM 7.03 includes several great innovations, such as the new ABAP-based Storage and Retention Service (SRS), a standalone ILM Legal Hold Management, a rule simulator for retention rules, a streamlined system decommissioning process, and many more. With the new features you can reduce the complexity of your IT landscape and increase the productivity of your administration staff.

Below is a brief overview of the major enhancements. Information on other enhancements will be provided in the official Release Note when available on the SAP Help Portal.

Reduced IT Complexity and Cost

  • The new Storage and Retention Service (SRS) is available in the application system (AS ABAP). You can use this ABAP-based service instead of the XML Data Archiving Service (XML DAS). This means you can use SAP NetWeaver ILM without a separate AS Java (on which XML DAS runs). Thus, you can simplify your IT landscape and reduce TCO.
  • The new standalone ILM Legal Hold Management, an ILM-specific function that can be used as an alternative to SAP Case Management, enables you to create and manage legal cases, perform e-discovery, display the related attachments, and include them in the hold. It is currently available only for use in live application systems. The complete availability of this function in the ILM Retention Warehouse system (system decommissioning scenario) is planned for future deliveries.

Increased Productivity

  • The menu structure of the ILM Cockpits was reworked to correspond to the current process steps of each ILM application scenario. The menu structures are filled according to the relevant roles. The ILM Cockpits are available in the Easy Access Menu of the application system as well as in NetWeaver Business Client (NWBC). The ILM Cockpits help administrators minimize their effort by guiding them through the ILM process.
  • The Rule Simulator enhances the transparency of retention rules processing. By simulating ILM rule evaluation for an ILM object, you can view the prerequisites as well as the result in a log, based on specific values. The log for rule simulation provides you with an overview of all relevant audit areas, including the existing policies as well as the related BOR objects for which legal holds have been defined.
  • The number of steps for transferring legacy data into the Retention Warehouse (RW) system was reduced, and some of these transfer steps can now be executed automatically. During the transfer of the archive administration data from the SN_META file, you can automatically transfer all repository tables from the legacy system to the RW system repository. The transfer and storage of the archive files can also be automated.

Joining the Ramp-up

If you’re interested in using these new ILM features, you can join the ramp-up to get the software delivered. Just talk to your SAP account executive or SAP customer engagement manager; they will help you manage your ramp-up nomination. A dedicated SAP ramp-up coach will be your primary contact during the ramp-up. He or she will guide you through the ramp-up process and help you address, together with SAP development, any problems that may come up.

The ramp-up is expected to last until May 2012. Upon successful completion of the ramp-up, SAP NetWeaver ILM 7.03 will become generally available to all customers.

For more information, please refer to the ramp-up page in the SAP Service Marketplace.

Video on SAP NetWeaver ILM Compliant Archiving

$
0
0

Dear ILM Community,

A new video is available that conveys the very essence of SAP NetWeaver ILM Compliant Archiving in only four minutes!

 

It's simple, crisp, concise ("chalk talk" or Khan style) and therefore ideally suited for those who are new to ILM and would like to grasp its basic idea very quickly.

 

The easiest way to view the video is on YouTube!

 

Hope you like it!

 

More videos like this are in the making.

 

Best regards,
Helmut Stefani


Video on SAP NetWeaver ILM System Decommissioning


Dear ILM Community,

Here is another short video on SAP NetWeaver Information Lifecycle Management.

 

By watching and listening to it (don't forget to turn your loudspeakers on), you will learn how SAP ILM supports you in shutting down unproductive systems in your landscape and, while doing so, saving costs and fulfilling legal obligations.

 

The easiest way is to view the video on YouTube!

 

I hope you like it!

 

Best regards,
Helmut Stefani

SAP Data Archiving Changes in ERP 60 EhP 6


I am currently testing SAP data archiving in ERP EhP 6 SP 3 and have found some issues, as well as several changes (compared to ERP 6.0 EhP 3), that I want to share with the SAP community.  This information is grouped into Configuration Issues, New Functionality for Current Archive Objects, New Archive Objects, and Updated Documentation.

 

Configuration Issues

 

In past tests (upgrades, previous enhancement packs, OSS Notes, etc.) I have found configuration-related issues with archive objects that have been touched by IS-Oil.  Multiple changes/fixes were needed in order to even begin testing data archiving.  For example, archive job variants were missing for MM_EBAN, MM_EKKO, and MM_EINA; this was easily resolved by recreating them via SARA/Customizing Settings.  The archive write job/program failed for SD_VBAK, MM_EKKO, and MM_EBAN; this was resolved by following the instructions in OSS Note 945459 (the note states that it is for SRM, but SAP verified that it can also be followed to fix this ERP issue).  The archive write job/program also failed for MM_EINA with "Archiving object or class MM_EINA does not contain structure OICQ4".  I compared the MM_EINA archive object structure definition in transaction AOBJ in the EhP 6 system with one in a system that had not been upgraded yet, and found that three segments were missing in the EhP 6 system.  I added those segments, and the archive write job then completed successfully.

 

New Functionality for Current Archive Objects

 

MM_EKKO:

 

I noticed that this archive object has new write, delete and preprocessing programs:

(screenshot)

 

I did have to manually change the Preprocessing program to reflect the new program name per OSS Note 1646578.

 

The new versions of the preprocessing and write job programs provide an additional option, "Residence Time Check Creation Date".

 

(screenshot)

 

If this new option is not selected, the residence time check is carried out against the last change date of the purchase order instead.

 

SD_VBAK:

 

The archive preprocessing and write jobs have a new option of "Check Valid-To Date".

 

(screenshot)

 

This option only applies to sales documents with a valid-to date (such as quotations, scheduling agreements, and contracts), and the program assumes the end of validity has been maintained.

 

WORKITEM:

 

The archive write job has a new option of "Delete Unnecessary Log Entries".

 

(screenshot)

 

I have not been able to determine exactly what this means yet as there isn't any SAP Help for this option.

 

PM_ORDER:

 

The preprocessing job has a new field of "Revision".

 

(screenshot)

 

The write job has the new Revision field as well as an additional section for "PS: Project":

 

(screenshot)

 

BC_DBLOGS:

 

The archive write job now provides the functionality to specify, by table name, which changes get archived.  Previously, ALL customizing tables that had logging turned on were archived.

 

(screenshot)

 

PM_QMEL:

 

The preprocessing program has added the fields "Planning plant (IWERK)" and "Revision (S_REVNR)", as well as the capability to save detailed logging information in the application log to help determine why a notification was not eligible for archiving.

 

(screenshot)

 

The write job also includes the same new functionality as the preprocessing program.

(screenshot)

 

The detailed log and log output functionality has also been added to the preprocessing program and write program for SM_QMEL.

 

New Archive Objects with EhP 6

 

I found that almost 200 new archive objects are delivered in EhP 6.  I will not go over all of them in this blog; I will pick a few to highlight.

 

Virsa Firefighter Logs:

 

Depending on how you use firefighter IDs, you may or may not need to control the growth of the log tables.  OSS Note 1041912 provides some Firefighter best-practice archiving strategy information.

 

If you do use firefighter IDs extensively, you can use data archiving for these tables:

 

/VIRSA/ZFFTNSLOG - Firefighter Transaction Log

/VIRSA/ZFFCDHDR - Firefighter Change Document

/VIRSA/ZVIRFFLOG - Firefighter Action Log

 

To archive, use transaction /VIRSA/FFARCHIVE (not SARA).  Before you can run this transaction, you will need to follow the instructions in OSS Note 1228205 to maintain the path where the archive file will be written, as indicated by the "Application Server File" parameter in the image below:

(screenshot)

Additional information on this can be found in the Firefighter User Guide available on the SAP Service Marketplace.

 

BC_E071K:

 

Starting in SAP_BASIS release 7.x, you can now archive transport information.  Archive object BC_E071K is standard in SAP_BASIS release 731.  For releases 700 to 720, you will need to be on the relevant support package as indicated in OSS Note 1340166.

 

(screenshot)

 

Note that only the entries from table E071K will actually be archived out of the system.  The related entries from tables E070 and E071 are only written to the archive file, but not deleted.

 

CA_SE16NCD:

 

Per OSS Note 1360465: if you use transaction SE16N to make changes to tables, the changes are recorded in separate change document tables.  Depending on the number of changes, these change document tables can become very large.

 

The tables in this archive object are:

 

SE16N_CD_KEY   Table Display: Change Documents – Header
SE16N_CD_DATA  Table Display: Change Documents – Data

 

It is recommended to archive this data using date intervals.

 

The archived data can then be displayed/analyzed with report RKSE16N_CD_DISPLAY.

(screenshot)
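If you want to start that display report from your own program, here is a minimal sketch (the report name in the SUBMIT is the one stated above; no parameters are passed, so the report's own selection screen is used to pick the archive files and date interval):

REPORT z_show_se16n_cd_archive.

" Open the standard display report for archived SE16N change documents;
" the user selects archive files and date interval on its selection screen.
SUBMIT rkse16n_cd_display VIA SELECTION-SCREEN AND RETURN.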

 

Archiving in GRC Access Control 10.0:

 

There are several new archive objects for archiving GRC Access Control data.  They are:

 

GRFNMSMP     Archiving for GRC AC 2010 Requests
SPM_AU_LOG   SPM Audit Log Archive
SPM_CH_LOG   Change Log Archive
SPM_LOG      Archiving for SPM Log Reporting
SPM_OC_LOG   SPM OS Command Log Archiving
SPM_SY_LOG   SPM System Log Archival

 

Updated Documentation

 

The Data Management Guide was updated in December 2011.  If you have not downloaded it from the Service Marketplace recently, you should check it out (logon required).

 

(screenshot)

 

To find out what has been added or updated, go to Chapter 2 "Which Tables are Examined".

 

(screenshot)

 

Here you can quickly find out what is new in this version of the document by checking the "Last Changed in Version" column.

 

There are a lot of changes related to SAP data archiving in ERP EhP 6.  This blog highlights just a few of them.  I hope you find this information useful.

Compliance Benefits of a Controlled Methodology for Decommissioning Legacy SAP Systems


Many companies still have older legacy SAP systems operational for audit compliance, review, and reporting purposes. We know that as data ages it becomes less valuable, and this is especially true for legacy data. However, businesses need to retain data until it reaches its end of life, because of legal holds, in support of governance and audit requirements, or for maintenance histories.

 

It becomes somewhat of a balancing act, as keeping everything is not only expensive and inefficient, but can be contrary to compliance guidelines. For the data that still has value for business, audit, or governance purposes, there are several options for those operating in an SAP-based solution environment:

 

  1. Leave the legacy SAP system running
     1. Most expensive, as you have to maintain:
        1. support pack updates
        2. backup schedules
        3. resources to support the system
        4. licensing costs
        5. other ongoing costs
  2. Decommission the system and rely on system recovery (most likely from tape) if data is needed
  3. Move the data to a less expensive platform
     1. Less expensive and easier to maintain:
        1. move to a platform that supports SAP solutions, such as Microsoft
        2. data is compressed
        3. only move over data that needs to be kept
           1. limits governance and compliance issues, as you only address the data being kept
           2. purge data that is no longer required
        4. have one support system with data from multiple SAP systems
     2. Facilitates decommissioning of the legacy SAP system
        1. saves space, licensing, and support costs (see the list above)
        2. easier to purge data once it reaches end of life

 

 

What it comes down to is this: the important thing to remember is to move the data in a ‘controlled’ manner.  For instance, many SAP systems contain older data that is not useful to the groups using the system, so keeping it in the active system takes up valuable space.  With a ‘controlled’ method, you take the time to understand what data is needed by the groups using the system.  An example would be Audit/Controls/Governance: you would most likely need financial, controlling, and asset data, the latter depending on audit regulations and country.  You also want to provide a means of extracting the data that is efficient and allows you to meet response deadlines.  Using purpose-built tools, the data resides where there is proper security, so that end users can access only the data for which they are authorized.  With these solutions in place, you can select the data to be moved, report only the data needed, and control user access to that data through authorizations.

 

Does your company have legacy SAP systems, and what are you doing about them?


Step by Step to implement a Destruction object



Our team has implemented social media channels (for example, Twitter and Facebook) in CRM 7.0 EHP3 (see here for details).


For product standard requirements, it is necessary to give our customers the choice to delete the Facebook posts and tweets stored in the CRM database tables according to their consent requirements. In this blog I will show you how to implement the destruction object for social post destruction. You can follow the same steps to create your own destruction object.

 

Note

 

ILM Retention Management is a licensed product. This is because the predecessor of the full ILM capability was data archiving, which was always included as part of the core suite.

 

 

Step 1: Create a new entry in tcode DOBJ

(screenshot)

Double-click the created entry and maintain the component for it. In my case I use "Customer Relationship Management". Also specify the name of the data destruction execution report.

(screenshot)

Step 2: Do the customizing in tcode IRM_CUST

Choose OT_FOR_BS:

(screenshot)


Create a new entry for your destruction object:

(screenshot)

Double click "Allowed Start Time", specify which field is considered by ILM framework to judge whether a post could be destructed.

Here I choose the IRM constant CREATION_DATE. It is just a constant but NOT the field of your database table. Later on we will map this constant to the real field of the database table.

(screenshot)


In "Allowed Policy Category", choose RTP as policy category.

(screenshot)

Double-click on the object category-specific customizing and specify your BOR object name below:

(screenshot)

Map the IRM constant CREATION_DATE to the field of your database table:

(screenshot)

In my database table, CREATION_DATE_TIME stores the timestamp of post creation, and UUID is the post's unique identifier.

(screenshot)

Specify the key field name of your database table.

(screenshot)

We have now finished all the customizing. The next step is to implement the destruction report CRM_SOCIAL_ILM_DES.

 

 

Step 3: Implement destruction report CRM_SOCIAL_ILM_DES

 


The selection screen of the report is shown below. It allows you to specify whether the destruction results are written only to the application log, to a list in addition to the application log, or only to a list. The Run comment enables you to maintain an explanatory text for each run so that you can easily find the run's information in the application log later.

(screenshot)

Most of the work is done by the utility class cl_crm_soc_destruct_ilm_tool. The source code can be found in the attachment.

 

 

REPORT crm_social_ilm_des.

INCLUDE crm_social_des_sel.
INCLUDE crm_social_des_f01.

INITIALIZATION.
  PERFORM processing_options_text_set.

START-OF-SELECTION.
  CALL METHOD cl_crm_soc_destruct_ilm_tool=>run
    EXPORTING
      iv_comment           = p_coment
      iv_test_mode         = p_test
      iv_detail_log_option = space
      iv_output_option     = p_prot_o.

 

The key point is in method PROCESS_POST_WITH_FOLLOWUP. In this method, you as the developer do not need to check whether a post can be destroyed. Instead, you tell the ILM framework which fields of the database table are used for the destruction evaluation, and the framework returns whether the post is destructible.

 

 

DATA: lv_field_value   TYPE if_lrm_bs_types=>ty_s_tabname_fieldname_values,
      lx_rule_exec     TYPE REF TO cx_lrm_rule_exec,
      lv_destructible  TYPE lrm_destructible,
      dref_uuid        TYPE REF TO crmd_soc_post-uuid,
      dref_create_date TYPE REF TO crmd_soc_post-creation_date_time,
      dref_type        TYPE REF TO crmd_soc_post-type,
      dref_tab         TYPE if_lrm_types=>ty_t_field_value.

CLEAR: lv_field_value, dref_tab, lv_destructible, mt_check_field_values.

" Pass the key field (UUID) of the post to be checked
lv_field_value-v_table_name = 'CRMD_SOC_POST'.
lv_field_value-v_field_name = 'UUID'.
GET REFERENCE OF iv_key-uuid INTO dref_uuid.
APPEND dref_uuid TO dref_tab.
lv_field_value-t_field_value = dref_tab.
INSERT lv_field_value INTO TABLE mt_check_field_values.

CLEAR dref_tab.

" Pass the start-time field (CREATION_DATE_TIME) that the retention rule evaluates
lv_field_value-v_table_name = 'CRMD_SOC_POST'.
lv_field_value-v_field_name = 'CREATION_DATE_TIME'.
GET REFERENCE OF iv_key-creation_date_time INTO dref_create_date.
APPEND dref_create_date TO dref_tab.
lv_field_value-t_field_value = dref_tab.
INSERT lv_field_value INTO TABLE mt_check_field_values.

" Let the ILM framework evaluate the retention rules for this instance
TRY.
    mr_irm->get_retention_rule_f_values(
      EXPORTING
        ith_field_values = mt_check_field_values  " field names and values of an instance
      IMPORTING
        ev_destructible  = lv_destructible ).     " whether the instance may be destroyed
  CATCH cx_lrm_rule_exec INTO lx_rule_exec.
    mr_ilm_destruction_db_run->error( ).
    " Do your error handling here
    RETURN.
ENDTRY.

 

 

In the next blog, I will show you how to test the new destruction object via tcode ILM_DESTRUCTION.

How to test destruction object via ILM_DESTRUCTION




In the previous blog we discussed how to implement a destruction object.


In this article, we will look at how to test the destruction object SOCIAL_DESTRUCTION.

 

Note

 

ILM Retention Management is a licensed product. This is because the predecessor of the full ILM capability was data archiving, which was always included as part of the core suite.


Step 1: Create a new audit area via tcode ILMARA

(screenshot)

Don't forget to select the "Object Assigned" checkbox so that your destruction object is assigned to the new audit area.

(screenshot)

Step 2: Maintain the destruction rule

Maintain the destruction rule for your destruction object in tcode IRMPOL.

(screenshot)

If a rule already exists for the specified audit area, press Continue (meaning you want to edit the existing rule); otherwise press New to create one.

Maintain the new policy and save it.

(screenshot)

The screenshot below shows a sample retention rule. It means that all social posts created more than ONE YEAR ago must be destroyed.

 

After you finish editing the rule, please make sure the policy status is set to "Live"; otherwise it will be ignored by the rule engine at runtime.

(screenshot)

Step 3: Launch the destruction run

There are several ways in which you can launch the destruction functionality provided by the destruction object.

Variant 1: Directly execute destruction program CRM_SOCIAL_ILM_DES online via SE38 (see the code sketch at the end of this variant)

(screenshot)

Test Mode: In this mode, the destruction program only analyzes the social posts and lists, for each post, whether it could be destroyed according to the residence time maintained in the IMG activity and the rules maintained in the IRM rule maintenance view. No real deletion is performed.

Production Mode: Performs the same analysis as Test Mode; the only difference is that the deletion is actually carried out if a social post is judged destructible.

Output Option:

1. List

The execution result is output as in the example displayed below:

(screenshot)


2. Application Log

 

(screenshot)

The output is not displayed directly on the screen but is written to the application log instead.

You can review all logs by following the steps below:

a. Call transaction code ILM_DESTRUCTION, maintain the following settings, and press F8:

(screenshot)

b. Click the "Protokolle" (Logs) button:

(screenshot)

Specify the user name and execute the selection:

(screenshot)

Then you can view all logs generated during the destruction run:

(screenshot)

3. List and Application log

 

The output consists of both the list output and the application log records.

 

Comment: this is an indicator for the current destruction run, which can help you identify the logs of the destruction run you are interested in.

(screenshot)

(screenshot)
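As mentioned under Variant 1, the destruction report can also be started from another program via SUBMIT. A minimal sketch (the parameter names p_test, p_coment, and p_prot_o come from the report source shown in the previous blog; the value passed to p_prot_o for the output option is an assumption):

REPORT z_start_social_destruction.

" Start the destruction report in test mode: analysis only, no deletion.
SUBMIT crm_social_ilm_des
  WITH p_test   = abap_true    " test/simulation mode
  WITH p_coment = 'Dry run'    " run comment, to find the log later
  WITH p_prot_o = 'X'          " output option (value is an assumption)
  AND RETURN.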

Variant 2: Schedule the destruction program as a background job via tcode ILM_DESTRUCTION

(screenshot)

(screenshot)

There is still time to participate in the ASUG/DSAG Archiving and ILM Survey


FYI: if you have not had time to fill out the survey we run every other year in collaboration with DSAG, you still have time.  The link to the survey is here.  The results will be presented at the 2014 ASUG Annual Conference in Orlando in June, in the Archiving and ILM Open Forum session on Tuesday, June 3rd at 12:30 PM.  The results will be presented at a DSAG event as well, and will be posted in the ASUG Archiving & ILM Discussion Forum after the conference.

 

We will leave the survey open for another week!

 

Best Regards,

Karin Tillotson

ASUG Archiving & ILM SIG Program Chair

Archiving SAP data and how to access it after the delete run


For a customer, I needed to archive different types of data from the SAP system. This is not really difficult via the standard SAP transaction SARA and the standard archiving objects. My question up front was: how do we access the data after the archiving delete run? In other words, how do we let SAP users still see the data that was deleted from the database during archiving?

I will use two examples with different outcomes:

  • the archiving object FI_DOCUMNT
  • the archiving object MM_ACCTIT

I’m a technical consultant and the steps below are from a technical point of view.

Preparations

In this example we will use the file system to store our archive files. Before we start archiving, we need to specify the archive filename and archive directory.

Destination folder and filename

Check the destination folder and naming convention in SAP transaction FILE.

Filename = ARCHIVE_DATA_FILE

(screenshot)

Destination folder = ARCHIVE_GLOBAL_PATH

(screenshot)
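If you want to check programmatically where a logical file name resolves on a given application server, the standard function module FILE_GET_NAME can be used. A minimal sketch (the report name is hypothetical, and passing the archiving object via PARAMETER_1 is an assumption that depends on how the logical file name is defined):

REPORT z_check_archive_path.

DATA lv_file(255) TYPE c.

" Resolve the logical file name to a physical path on this server
CALL FUNCTION 'FILE_GET_NAME'
  EXPORTING
    logical_filename = 'ARCHIVE_DATA_FILE'
    parameter_1      = 'FI_DOCUMNT'   " assumption: archiving object as parameter
  IMPORTING
    file_name        = lv_file
  EXCEPTIONS
    file_not_found   = 1
    OTHERS           = 2.
IF sy-subrc = 0.
  WRITE: / lv_file.
ELSE.
  WRITE: / 'Logical file name could not be resolved.'.
ENDIF.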

Check tables in archiving object

Before we start archiving, of course, we need to decide together with the business what data to archive and what the retention times will be. In this example we will focus on the technical part of archiving.

Check which tables are touched by a certain archiving object using SAP transaction DB15:

(screenshot)

Execution

For archiving we use SAP transaction SARA.

From a high level point of view the procedure is basically the same for FI_DOCUMNT and MM_ACCTIT:

  1. Write (the data to the archive files)
  2. Delete (the data from the database)
  3. Management (to see the status of the archiving)

The SAP transaction SARA for the archiving object FI_DOCUMNT looks like this:

(screenshot)

So the first step is Write. Simply click on the Write button; the next steps are self-explanatory. You have to create a variant in which you specify which data you want to archive, and you have to specify a printer and a start date. After this, background jobs are created and started. You can then check the job logs, the spool files, and the files generated at the operating system level. Also check the SARA Management button to see the status (the status will be yellow because the delete run has not been done yet).

The step after this is Delete. Simply click on the Delete button; the next steps are self-explanatory. You have to select the archive files from the previous step, and this data will be deleted from the database. After this, you can check the logs and the SARA Management button to see the status (the status will be green).

Specifically for FI_DOCUMNT there is the additional step “PostProc” in SARA; this one is not available for MM_ACCTIT.

View data from archive file after the delete run

So how can SAP users still see SAP data from within the SAP system? This differs per archiving object, which is why this example uses two archiving objects.

FI_DOCUMNT

The data is archived via SARA and deleted from the database. The transactions that users normally run do not show the data unless we perform some additional actions.

Create a secondary index via report SAPF048S (SE38 > SAPF048S):

(screenshot)

Use the Archive selection button to select the archived data. Uncheck “Creation only Acc to Runtime” and check “Production Mode”.

Then go to SAP transaction SARI (indexing of archive files), click on “Customizing”, search for *FI_DOC*, choose SAP_FI_DOC_002, and choose Activate:

(screenshot)

This on its own is not yet enough; we need to perform a build using menu > Goto > Build. After this, the status in SARI looks good:

(screenshot)

Now we are done. Besides displaying the archived and deleted data via SAP transaction SARI, we can also display it directly via FB03 or ALO1. So, after the SARA delete run, to make the data still available to SAP users via the original transactions, we have to:

  • Create index via SAPF048S
  • via SARI activate Archive Infostructure SAP_FI_DOC_002
  • via SARI build Archive Infostructure
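For completeness: besides SARI and the original transactions, archived data can also be read programmatically with the ADK read API. Below is a minimal sketch for FI_DOCUMNT that reads the BKPF records of each archived data object (the report and variable names are illustrative; ARCHIVE_OPEN_FOR_READ lets you select which archive files to read):

REPORT z_read_fi_documnt_archive.

DATA: lv_handle  TYPE sytabix,                 " ADK archive handle
      lt_bkpf    TYPE STANDARD TABLE OF bkpf,  " accounting document headers
      lv_records TYPE i.

" Open the archive files of the archiving object for reading
CALL FUNCTION 'ARCHIVE_OPEN_FOR_READ'
  EXPORTING
    object         = 'FI_DOCUMNT'
  IMPORTING
    archive_handle = lv_handle.

" Loop over all data objects in the selected archive files
DO.
  CALL FUNCTION 'ARCHIVE_GET_NEXT_OBJECT'
    EXPORTING
      archive_handle = lv_handle
    EXCEPTIONS
      end_of_file    = 1
      OTHERS         = 2.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.

  " Read all BKPF records of the current data object
  CALL FUNCTION 'ARCHIVE_GET_TABLE'
    EXPORTING
      archive_handle        = lv_handle
      record_structure      = 'BKPF'
      all_records_of_object = 'X'
    IMPORTING
      number_of_records     = lv_records
    TABLES
      table                 = lt_bkpf.

  " ... evaluate lt_bkpf here ...
ENDDO.

CALL FUNCTION 'ARCHIVE_CLOSE_FILE'
  EXPORTING
    archive_handle = lv_handle.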

 

 

 

MM_ACCTIT

So for the data archived via FI_DOCUMNT, we were able to make the data accessible via the original transactions. Unfortunately, this is an exception; normally, archived data is only accessible via transaction SARI. We will see this with the data that is archived via MM_ACCTIT.

 

So the data is archived via SARA and is deleted from the database.

 


Go to SAP transaction SARI (indexing of the archive files), click on “Customizing”, search for *ACCTIT*, choose SAP_MM_ACCTIT02, and choose Activate:


(screenshot)


This on its own is not yet enough; we need to perform a build using menu > Goto > Build according to date.

 

After this the status in SARI looks good:

 

(screenshot)

 

Now we are done.

We can display the archived and deleted data via SAP transaction SARI > Archive Explorer.

 

(screenshot)

 

Best regards,

Marco


Starting with the SAP Data Retention Tool (DaRT)



For a company I work for, there was a need to archive data because the database was growing, and there was also a need to keep data available for future audits.

But what if we delete the data from the database and it is then needed for an audit? Well, we still have the archive files and can still view the data.

See "Archiving SAP data and how to access it after the delete run"

 

Although that is true, the customer wanted to work with DaRT extracts to be able to give a future auditor exactly what they need. This DaRT process is normally done before the archiving project starts.

In the steps below I will show you how to configure and create a DaRT extract.

It will be shown from a technical point of view (as I'm a technical consultant).

Of course, you also need a functional or business consultant to determine the data to be captured (for example, the period and the company code).

 

 

First, a short definition of DaRT compared to ILM (Information Lifecycle Management):

(see http://scn.sap.com/docs/DOC-28852)

 

DaRT is a tool offered as part of the SAP Business Suite. Its main purpose is to retain the relevant finance and tax data from a productive system and make it available to auditors. The data being exported (so-called DaRT extracts) is not deleted from the productive database, which leads to a duplication of data. As soon as the data is needed to support an audit, the exported files are imported into the “DaRT Viewer”.

 

SAP NetWeaver ILM is a total solution of which Retention Management is a part. It applies to productive systems; it enhances archive files with their specific retention times, stores the files, and enables automated destruction once the retention time has expired. Its technical foundation is the standard data archiving provided as part of the SAP Business Suite.

 

 

In this example I created the DaRT extracts for a French company code, and a few SAP Notes had to be imported specifically:

0001906078 -  FTW1A: BADI for additional selections from BKPF

0001923322 -  DaRT: View file creation for France Legal Requirement

0001935509 -  Carry forward postings for new GL in France

 

Related transaction codes

First an overview of DaRT related transactions:

 

(screenshot)

 

DaRT configuration

SAP transaction FTWP

In this example we choose Financial documents and Tax data.

 

(screenshot)

 

Go to the File directories tab and create a new one:

 

(screenshot)

 

The 0 is placed automatically. Leave this field blank.

If your SAP system has separate application servers, make sure this directory is accessible from all systems.

 

The other tabs can be left at their default settings.



Create DaRT extract

We use SAP transaction FTW1A

Here we need the functional or business consultant to determine the data to be captured. In this example we choose a certain company code and a certain fiscal year.

 

(screenshot)

 

Result:

 

(screenshot)

 

(screenshot)

 

At the operating system level we can see that the files have been created:

 

(screenshot)

 

 

DaRT logging

Use SAP transaction FTWL to see the DaRT logging after the run is done:

 

(screenshot)

 

 

(screenshot)

 

DaRT extract browser

Use SAP transaction FTWF to browse through the DaRT extract:

 

(screenshot)

(screenshot)

 

 

Double-click on a line to see the detailed data.

 

 

DaRT data view queries

Besides SAP transaction FTWF for browsing through the DaRT extract, we have transaction FTWH for the DaRT data view queries:

 

(screenshot)

 

Because we created a tax-related DaRT extract, we now choose "0SAP_BSET  Tax data document segment".

In the next screen select the source extract and directory set and execute.

 

(screenshot)

 

The data that an auditor needs to see and review is then displayed.

 

Best Regards,

Marco

Retirement takes planning AND implementing.


I went to a retirement party last week.  I pictured we would be sitting outside on the deck where hors d’oeuvres would be served with the fanciful summer cocktail menu.  But alas, it was dull and dreary in the basement where co-workers had assembled to bid their adieu.  The recipient of the send-off had served the company unfailingly for over 30 years, but had found that a storied skill set had been replaced by a better-connected, shinier, faster face.  The retirement had not really been planned; it was more like someone created a flash mob at the behest of the auditors.  Not the type of orderly and grateful departure one might expect at the end of such a long-term service engagement.

 

The future is here, broadcast in the bright Technicolor of our youth, with new sounding names and labels like Nexus of Forces, Big Data, Mobile, Social Engagement, the Internet of Things and of course, the ubiquitous CLOUD.  In the rush to adopt the latest and greatest technological innovation to preserve a competitive edge, we are forgetting about the seasoned veterans and the secrets they keep.

 

Application retirement, or decommissioning, is about managing the safe passage of data and documents from online or archived databases in old systems that are no longer used for current business processing.  The information must be retrieved from the retiring system and migrated to a new location, where it can be accessed and managed as necessary.  Migration offers more cost-effective storage options.  Retiring the application eliminates the cost of keeping a fully operational system up and running simply to serve as an access point to the data when that auditor rings your phone.  There is also a risk in keeping these systems around: old technology is, well, old.  The aging process is not graceful for many of these legacy systems, and they tend to break down at the most inopportune times.

 

Consider the case where a global manufacturer faces a product liability issue that requires accessing years-old data.  The fellow who had been entrusted to maintain that database, with the checkered sport coat and too-wide tie, has recently retired himself.  Once the code is cracked and the system is finally accessed, it is determined that the disk with critical data has actually failed.  A painful litigious process just got a whole lot worse. 

 

Legacy decommissioning is a task that never seems critical enough to get the nod from the budget allocators until it’s too late.  Part of the problem is that it is not well-enough understood to make the business case in words or numbers.  But it should be on every IT/data management roadmap to ensure your organization remains compliant and protected.  Living with the risk should not be the strategy for these systems.  Sipping fancy iced drinks knowing critical data and documents are accessible at any time sounds a whole lot better.  

How to Find a Certified ILM Storage Partner



Dear ILM community,

As some of you may have noticed, the SAP Partner Directory website where you can look up certified SAP partners has changed. It is now available under the following link:

 

http://global.sap.com/community/ebook/2013_09_adpd/enEN/search.html

 

If you are looking for a certified WebDAV storage partner for SAP ILM, simply enter 'BC-ILM' in the search field. To distinguish between the 2.0 and 3.0 certifications, enter 'BC-ILM 2.0' or 'BC-ILM 3.0'. See the screenshot below:

 

(screenshot)

 

 

The matrix (columns) on the right-hand side shows the partners' expertise, i.e. the areas in which they are particularly engaged with SAP. However, this matrix seems to be incomplete right now, so please disregard it for the time being.

 

Best regards,

Helmut Stefani

Understanding SAP ILM Archiving Fiori applications in 5 questions


1. Which Fiori applications are available for the ILM archiving feature?

We have delivered two Fiori applications that help an Archiving Expert (or an equivalent role) manage data volume by identifying high-growth database tables and monitor archiving jobs.

The Fiori applications which support these activities are as follows -

  • "Analyze Archiving Variants Distribution" Fiori application - This application helps the Archiving Expert to visualize the impact of the archiving runs on the database volume. It helps to identify ineffective variants used in the archiving runs and helps define better variants.
  • "Monitor Archiving Jobs" Fiori application - This application helps the Archiving Expert to monitor existing archiving runs. It also provides the added benefit of the possibility to take action, whenever required, by deleting, terminating or retriggering jobs in a seamless manner without having to go through the normal transactions.

 

2. What are the major features of the Fiori applications?

"Analyze Archiving Variants Distribution" Fiori application

  • Identify empty runs easily and share in SAP JAM
  • Identify variants which are ineffective and change the variants as required

"Monitor Archiving Jobs" Fiori application

  • Archiving jobs categorized by their status
  • Easy to identify jobs which require action
  • View logs (Job and Application) and send Log by mail
  • Reschedule or delete Archiving Jobs

 

3. What do the applications look like?

We have a speed demo that allows you to click through the applications. It can be accessed using this link: http://demo.tdc.sap.com/speeddemo/3a22eb92f71f31c2

 

4. What are the technical prerequisites for implementing the Fiori applications?

The details about the applications (including the technical prerequisites) are published in the help portal: http://help.sap.com/fiori_bs2013/helpdata/en/5c/0de9522806b267e10000000a441470/frameset.htm

 

5. Where/how do I get more information about the Fiori applications?

We plan to collect feedback by allowing end users to test the applications during various SAP events, or by organizing individual workshops based on interest. If you are interested in participating in these tests or in learning more about the applications, please get in touch with santosh.v@sap.com.

Video about SAP ILM Archiving Fiori applications


The SAP Fiori applications focusing on archiving have been explained in the SCN blog "Understanding SAP ILM Archiving Fiori applications in 5 questions". We have also created a YouTube video that explains how the Fiori applications help an Archiving Expert access all the necessary data in one place and trigger corrective action if required.

 
