
Monday, 1 July 2019

Extending Storage Type in SDL Tridion DOCS 2013 SP2

Recently, I got the opportunity to explore and work on SDL Tridion DOCS 2013 SP2, where I needed to develop an extension. The client uses their own Elasticsearch instance and wants us to continue using that instance to index DOCS data.

To start with, I explored the SDL DOCS documentation to find out whether storage extension is supported at all, and yes, it is; in fact, not just storage, you can extend the Deployer as well :). "Maybe in the next blog".

After spending some time on the documentation, I was clear about what I needed to do in order to extend the storage type.

For the initial steps, "How to set up the Java project and generate the custom JAR", you can follow my previous blogs. In SDL Tridion DOCS, we have the following Content Delivery roles, through which data is published into the UDP (Unified Delivery Platform):

  • Discovery Service
  • Content Service
  • Context Service
  • IQ Service for indexing and querying data in/from Elasticsearch
  • Contextual Image Delivery
  • UGC Service
Coming back to extending the storage: we need to configure the Deployer. Here we deploy the custom JAR and customDAO.xml and configure the cd_storage file, similar to what we do when extending storage in Tridion WEB/SITES.

Below is the output captured in the log file, in JSON format, after deploying the custom JAR containing the extension code and publishing an item from SDL Tridion DOCS 2013 SP2.


JSON in deployer logs

JSON of Dummy content

In the next blog, we will see what else we can do with this and how flexible the storage extension point is. Until then,

Happy Coding and Keep Sharing!!!


Sunday, 16 June 2019

Extending Content Delivery Storage in SDL WEB 8.5 - Part 3

In the last blog, we saw that publishing a new component only calls the create method, but when we re-publish an item it first calls the remove method and then create.

Remove Method


public void remove(int publicationId, int componentId, int componentTemplateId,
        ComponentPresentationTypeEnum componentPresentationType) throws StorageException
{
    log.debug("Custom storage remove Method");
    // Let the default storage layer remove the item first.
    super.remove(publicationId, componentId, componentTemplateId, componentPresentationType);
    log.debug("Custom storage remove Method :- " + componentId);
}
Re-Publish an item


As you can see, when I re-published an item, the remove method was invoked first and then the create method. Next, let's un-publish an item and see in what sequence the methods are called.


Un-Publish an Item

When we publish a new item, only the create method is called; re-publishing calls the remove method and then create; and on un-publish, only the remove method is called.


In the last two blogs, we learned how to set up and configure the project to build a storage extension and what happens when we publish a new item. In this blog, we saw the remaining operations as well: how they are invoked and in what sequence. With this approach, we can add/update/remove DCPs in a custom storage, e.g. SOLR, Elasticsearch, MongoDB, etc.
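
To make this concrete, here is a minimal sketch of how the two overridden methods could route each event to an external index. The indexClient helper is hypothetical (your SOLR, Elasticsearch, or MongoDB client code would sit behind it); only the method signatures come from the storage API shown above.

@Override
public void create(ComponentPresentation itemToCreate, ComponentPresentationTypeEnum componentPresentationType)
        throws StorageException
{
    super.create(itemToCreate, componentPresentationType);   // default storage first
    indexClient.addOrUpdate(itemToCreate);                   // new publish, or second half of a re-publish
}

@Override
public void remove(int publicationId, int componentId, int componentTemplateId,
        ComponentPresentationTypeEnum componentPresentationType) throws StorageException
{
    super.remove(publicationId, componentId, componentTemplateId, componentPresentationType);
    indexClient.delete(publicationId, componentId);          // un-publish, or first half of a re-publish
}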

This data can further be used for analytics and for third-party applications.


Happy Coding and Keep Sharing !!!



Extending Content Delivery Storage in SDL WEB 8.5 - Part 2

In the previous blog, we set up the project and configured everything required; based on that, we published a DCP and saw our custom logs appear, which means everything is working fine.

Today we are going to read the content of the DCP and see how the storage extension behaves when we publish, re-publish, and un-publish an item.

The content is available in the form of bytes, and we need to convert it; below is a code snippet that lets you access the DCP content in your custom code.

public void create(ComponentPresentation itemToCreate, ComponentPresentationTypeEnum componentPresentationType)
        throws StorageException
{
    // Let the default storage layer store the item first.
    super.create(itemToCreate, componentPresentationType);
    log.debug("Custom Code Create Method COMID :- " + itemToCreate.getComponentId());

    // The DCP content is exposed as a byte array; decode it as UTF-8.
    byte[] dcpComponent = itemToCreate.getContent();
    try
    {
        String componentPresentation = new String(dcpComponent, "UTF-8");
        log.debug("Custom Storage create DCP :- " + componentPresentation);
    }
    catch (UnsupportedEncodingException ex)
    {
        log.error("Custom Storage create Unsupported " + ex);
    }
}

DCP content

Publish a new component, and you will see the DCP content in the deployer log file. But what happens when we re-publish or un-publish the same component, and how do we handle that in the custom code? We will see in the next blog, until then.


Happy Coding and Keep Sharing !!!





Saturday, 15 June 2019

Extending Content Delivery Storage in SDL WEB 8.5 - Part 1

In this blog, we are going to extend the storage layer in SDL WEB 8.5, but first we need to set up the project and make sure that the custom code interacts with the default process.

Pre-Requisites
  1. Eclipse, or your favorite Java IDE. You can also follow my previous blog on how to set up the project in Eclipse and generate a custom JAR. [here]
  2. JAVA 8
  3. Default SDL JARs
We can customize the way existing items are stored by the storage layer using the DAO ("Data Access Object") implementation pattern. We need to extend JPAComponentPresentationDAO and implement ComponentPresentationDAO, which allows us to extend the storage layer for Dynamic Component Presentations.

Step 1. Create a Java class file and add the following imports.
import com.tridion.broker.StorageException;
import com.tridion.storage.ComponentPresentation;
import com.tridion.storage.dao.ComponentPresentationDAO;
import com.tridion.storage.persistence.JPAComponentPresentationDAO;
import com.tridion.storage.util.ComponentPresentationTypeEnum;
import java.io.UnsupportedEncodingException;
import java.util.Collection;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
Step 2. Extend the class JPAComponentPresentationDAO and implement ComponentPresentationDAO. We also need to insert a @Component and a @Scope("prototype") annotation between the import statements and the class definition, as shown in the sketch below.

Custom Storage Extension
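
As a sketch, the class declaration could look like the following. The constructor signatures here are assumptions based on the JPAComponentPresentationDAO base class, so verify them against the default SDL JARs for your version; the class name is illustrative.

@Component
@Scope("prototype")
public class CustomComponentPresentationDAO extends JPAComponentPresentationDAO implements ComponentPresentationDAO
{
    private static final Logger log = LoggerFactory.getLogger(CustomComponentPresentationDAO.class);

    public CustomComponentPresentationDAO(String storageId, EntityManagerFactory entityManagerFactory, String storageName)
    {
        super(storageId, entityManagerFactory, storageName);
        log.debug("Custom storage DAO initialised for storage: " + storageName);
    }

    public CustomComponentPresentationDAO(String storageId, EntityManagerFactory entityManagerFactory, EntityManager entityManager, String storageName)
    {
        super(storageId, entityManagerFactory, entityManager, storageName);
    }
}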

Step 3. Create a storage DAO bundle config file.
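
A minimal sketch of the bundle file, assuming the DAO class from Step 2 (the fully qualified class name is illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<StorageDAOBundles>
    <!-- Map Dynamic Component Presentations to the custom DAO -->
    <StorageDAOBundle type="persistence">
        <StorageDAO typeMapping="ComponentPresentation" class="com.tridion.storage.extension.CustomComponentPresentationDAO"/>
    </StorageDAOBundle>
</StorageDAOBundles>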



Step 4. Copy this config file into the Deployer config folder, then open the cd_storage config file in your favorite editor and add the following element.

Edit storage config file.
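
The element in question is the StorageBindings bundle reference, which points the storage layer at the DAO bundle file from Step 3; it sits inside the Storages section:

<Storages>
    ...
    <StorageBindings>
        <Bundle src="CustomStorageDAOBundles.xml"/>
    </StorageBindings>
    ...
</Storages>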

Step 5. Copy the custom JAR file into the deployer's lib folder and restart the deployer.

Custom JAR.


Step 6. We are done with code and configuration; now we need to publish a DCP and check the logs.

Logs

In the log file, we can see that the custom log entry is present and returns the component ID, which means the code is interacting with the default process.


In the next blog, we will look at the other methods: how the code behaves on publishing, un-publishing, and re-publishing, and how to read the DCP content. Until then,

Happy Coding and Keep Sharing !!!





Sunday, 2 June 2019

Custom Deployer Extension in SDL WEB 8.5 Part - 1

Today, we are going to see how to extend the default SDL Tridion deployment functionality using a custom Deployer extension. Before we start building and setting up the project, let's first understand what the Deployer extension is for. When a user publishes content, the Content Deployer unpacks the incoming Transport Package and processes its transport instructions. We can extend the default behavior of the Content Deployer by creating a custom Module and adding it to a Step, or by extending an existing Module.

Deployer Extension:
  • The Deployer extension is used to inject additional functionality into the default SDL Tridion deployment process.
  • Use it if you want to implement your own custom logic/data. "The Event System might be useful here as well 🤔; we need to be absolutely sure about it."
  • Use it if you need specific data/info that is only available at the time of deployment.
  • Based on Java.
  • You can extend the default behavior of the Content Deployer by creating a custom Module and adding it to a Step, or by extending an existing Module.
To set up the Java project for the Deployer extension, I used the Eclipse IDE, the default JARs provided by SDL, and Java 8.

First, we need to configure the JAVA project using Eclipse.

1. Click File --> New --> Other, then select Java Project and click Next.

Create New JAVA Project

2. Enter the Project Name

Enter the Project Name.

3. Create a new folder called "lib" which will contain all the JARs.

Create new Folder lib

4. Copy all the JARs into lib; you can find the default JAR files in the deployer microservice's lib folder. Once copied, add them to the build path.

Add JARs Build Path


5. Next, create a Java package and then a class file.

Create a new Java Package

Create a New Class
6. Start importing all the namespaces required to extend the deployer.

Tridion Default Namespaces.

7. Next, we must implement the abstract method Module.process(TransportPackage).

package com.tridion.deployer.extension;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.tridion.configuration.Configuration;
import com.tridion.configuration.ConfigurationException;
import com.tridion.deployer.Module;
import com.tridion.deployer.ProcessingException;
import com.tridion.deployer.Processor;
import com.tridion.transport.transportpackage.TransportPackage;

public class PurgeCache extends Module {

    protected static Logger log = LoggerFactory.getLogger(PurgeCache.class);

    public PurgeCache(Configuration config, Processor processor) throws ConfigurationException
    {
        super(config, processor);
    }

    // This is called once for each Transport Package that is deployed.
    @Override
    public void process(TransportPackage data) throws ProcessingException
    {
        log.debug("This is custom logs");
        int publicationId = data.getProcessorInstructions().getPublicationId().getItemId();
        log.debug("PublicationId : " + String.valueOf(publicationId));
    }
}


That's it. Export this into a JAR; next comes the configuration in the deployer_config.xml of the Deployer microservice. "Don't forget to restart the service".


Click Export

Click Jar file and then Next
Custom Deployer Extension JAR file is ready


Finally, we need to deploy the custom JAR into the deployer microservice's lib folder, then open the deployer_config.xml file in your favorite editor, add the configuration below, and save the file. Don't forget to take a backup before you start editing.


 <Step Factory="com.sdl.delivery.deployer.steps.TridionExecutableStepFactory" Id="PurgeCacheSteps">  
   <Module Type="PurgeCache" Class="com.tridion.deployer.extension.PurgeCache">  
   </Module>  
 </Step>  

We also need to edit the logback.xml:

 <logger name="com.tridion.deployer.extension">  
     <appender-ref ref="rollingDeployerLog"/>  
 </logger>  

Let's do some publishing and see if everything is working fine; ideally, we should see the new log entries. If you see the custom logs in the log file, your custom code is interacting with the default deployment process and you've built the extension successfully.

Log File


In the next blog, we will use the deployer extension for a very interesting use case, until then.

Happy Coding and Keep Sharing !!!



Sunday, 16 April 2017

Generic Storage Extension Solution with ElasticSearch, MongoDB and SOLR

In my last couple of blogs, we discussed how to extend the storage capability of SDL WEB 8. Today we are going to create one common solution where we can decide which of the following we want to use as the extended storage medium just by updating the configuration file. This gives you the flexibility to select the storage medium you want to enable:
  1. Elastic Search
  2. MongoDB
  3. SOLR 
  4. You can also use Custom DB.
Let's discuss: what is a storage extension?

Storage Extension: the capability of using a custom storage medium to store the data. When we publish, the data goes into the Broker DB or onto the file system, depending on the settings, but we can extend this functionality with the storage extension technique.

How do we do that? We need to override the methods that publish and un-publish (create, update, and remove) component presentations to the Broker DB, and update the cd_storage_conf.xml file to inject the custom storage extension code.

High-Level GenericStorageExtension Architecture Diagram

You can download the sample code from here

Happy Coding and Keep Sharing !!!!


Monday, 23 May 2016

MongoDB Integration 4 Tridion-Mi4T

Introduction

MongoDB Integration 4 Tridion-Mi4T intends to provide Tridion integration with MongoDB.

Let's look at some of the features and advantages of using MongoDB.

Advantages
  1. Schemaless: MongoDB is a document database in which one collection holds different documents. The number of fields, the content, and the size of a document can differ from one document to another.
  2. The structure of a single object is clear.
  3. Deep query-ability: MongoDB supports dynamic queries on documents using a document-based query language.
Why should you use MongoDB?
  1. Document-oriented storage: data is stored in the form of JSON-style documents
  2. Index on any attribute
  3. Replication & High Availability
  4. Rich Queries
Where should you use MongoDB?
  1. Big Data
  2. Content Management and Delivery
  3. Mobile and Social Infrastructure
  4. User Data Management

How to Set Up and Configure Mi4T

We have five different modules in Mi4T:
  1. MongoDB - setup and configuration
  2. Template Building Block - a C# TBB used to get the component DCP in XML format after some changes.
  3. Custom Storage Extension - a JPAComponentPresentationDAO-based custom storage extension to manipulate the dynamic component presentations (a sketch of how it calls the index service follows this list).
  4. MongoDBIndexService - a WCF RESTful service that is invoked by the custom storage extension and takes the DCP as input.
  5. MongoDBSearchService - a WCF RESTful service that gets the data from MongoDB and takes an input query in JSON format.
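
Here is a rough sketch of how the custom storage extension hands a DCP to MongoDBIndexService over HTTP. The endpoint and payload shape are taken from the test section further down; the jsonEscapedDcp parameter is illustrative and stands for the DCP already escaped for JSON.

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

// Sketch only: called from the overridden create()/remove() methods of the custom DAO.
private void sendToIndexService(String jsonEscapedDcp) throws IOException
{
    String body = "{\"ServicePayload\":{\"DCP\":\"" + jsonEscapedDcp + "\",\"LanguageInRequest\":\"en\"}}";
    HttpURLConnection conn = (HttpURLConnection) new URL("http://localhost/Service1.svc/AddDocument").openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Content-Type", "application/json");
    conn.setDoOutput(true);
    conn.getOutputStream().write(body.getBytes("UTF-8"));
    // 200 expected; the JSON response carries ServicePayload.Result (0 = success, 1 = failure).
    int status = conn.getResponseCode();
}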

SETUP

Below are various setup steps
  1. CMS Setup
    • Copy the templating building block (TBB) to a location on your Tridion CM server
    • Upload the MI4TIndexing.Templating.dll TBB to the Tridion CMS
    • Create a Component Template with the following attributes:
      • Output Format: XML Fragment
      • Add the GetComponentAsXML TBB, Publish Binaries in Package, Link Resolver, and Cleanup Template
  2. MongoDB Setup
  3. Content Delivery Setup
    • Open the cd_storage_config.xml storage configuration file from the /bin/config folder and add the following node under the Storages section:
      • <StorageBindings><Bundle src="CustomStorageDAOBundles.xml"/></StorageBindings>
    • Copy the CustomStorageConfig.xml file and change the values of the following nodes (a sketch of this file follows this list):
      • ServiceEndPoint - URL of the index service
      • TemplateIdToIndex - TCM ID of the Component Template created in step 1 (CMS setup)
    • Copy the CustomStorageDAOBundles.xml file into the Content Delivery /bin/config folder
  4. MongoDB Index Service
    • Copy MongoDBIndexService to your server and host it in IIS
    • Copy the configuration folder as well
      • You can update the log file path in Logging.config inside the configuration folder
    • Update the path to configuration\logging.config in the web.config of the index service
  5. MongoDB Search Service
    • Copy MongoDBSearchService to your server and host it in IIS
    • Copy the configuration folder as well
      • Update the log file path in Logging.config inside the configuration folder
    • Update the path to configuration\logging.config in the web.config of the search service
  6. The index service supports publishing, re-publishing, and un-publishing
  7. The search service returns the data in JSON format.
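
For reference, CustomStorageConfig.xml might look roughly like this; the wrapper element and layout are assumptions (check the file shipped with Mi4T), and the TCM ID is a placeholder:

<Configuration>
    <!-- URL of the Mi4T index service -->
    <ServiceEndPoint>http://localhost/Service1.svc/AddDocument</ServiceEndPoint>
    <!-- TCM ID of the Component Template created during CMS setup (placeholder value) -->
    <TemplateIdToIndex>tcm:2073-13668-32</TemplateIdToIndex>
</Configuration>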

To test the index and search services

  1. Index service

    1. You can use Fiddler for debugging
    2. I have created a sample schema, article, with the following fields:
      1. title
      2. description
      3. imageurl
    3. Run the index service in Fiddler
      • URL: http://localhost/Service1.svc/AddDocument
      • Input JSON, as generated by the custom storage extension:
      • {"ServicePayload":{"DCP":"<Content xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' xmlns:xsd='http://www.w3.org/2001/XMLSchema' xmlns:xlink='http://www.w3.org/1999/xlink' xmlns:tcm='http://www.tridion.com/ContentManager/5.0' Title='Copy of Demo Of MongoDB' Id='tcm:2073-13667'><title>Demo Of MongoDB</title><description><![CDATA[Demo of component creation in Tridion using MongoDB]]></description><imageurl>/images/demo.png</imageurl><publication Id='tcm:0-2073-1' Title='03 Content Master' /></Content>","LanguageInRequest":"en"}}
    4. Execute this request in Fiddler and verify the results returned
    5. Check in MongoDB as well, inside the collection/document you created and configured in the index service
    6. The Result field is 0 for success and 1 for failure:
      • {"ResponseContext":{"EnvironmentContext":null,"FaultCollection":[]},"ServicePayload":{"ErrorMessage":"","Result":0}}
    7. Test this by publishing the Component as well
    8. In the log file, a Response Result of 0 means the item was published successfully and the data is indexed in MongoDB
  2. Search Service
    1. You can use Fiddler for debugging
    2. Run the search service in Fiddler
      • URL: http://localhost/SearchSvc.svc/GetContentFromMongoDB
      • {"ServicePayload":{"ContentType":"Content","Filters":[{"Key":"ItemURI","Value":"tcm:2073-13667"},{"Key":"publicationID","Value":"tcm:0-2073-1"}]}}
      • Execute the request in Fiddler and verify the results returned
      • This service returns results based on the filters you provide and uses map/reduce:
      • //Map/Reduce
        var map =
            "function() {" +
            "    for (var key in this) {" +
            "        emit(key, { count : 1 });" +
            "    }" +
            "}";

        var reduce =
            "function(key, emits) {" +
            "    total = 0;" +
            "    for (var i in emits) {" +
            "        total += emits[i].count;" +
            "    }" +
            "    return { count : total };" +
            "}";
      • The output will be in JSON format:
      • { "_id" : ObjectId("5741539eef525465db9eb131"), "title" : "Demo Of MongoDB", "description" : "Demo of component creation in Tridion using MongoDB", "imageUrl" : "/images/demo.png", "ItemURI" : "tcm:2073-13667", "publicationID" : "tcm:0-2073-1" }
      • You can use AND, OR, and NOT as logical operators to query the data from MongoDB:
        • {"ServicePayload":{"ContentType":"Content","Filters":[{"Key":"ItemURI","Value":"tcm:278-13667"},{"Key":"publicationID","Value":"tcm:0-278-1"}],"MongoDatabase":"customerDatabase","Table":"article","QueryType":"OR"}}
  3. You can download the code here

Happy Coding & Keep Sharing !!!
