
Wednesday 4 December 2019

Using AWS Services to Monitor Tridion

When a large, business-critical publishing run is in progress and you want to monitor every state, AWS services are a great way to do it. Recently, I had the opportunity to use AWS services to monitor SDL Tridion publishing and Broker Database spikes: we needed to track the publishing states and the Broker DB connection count.

This kind of monitoring becomes almost mandatory when you have a huge infrastructure to manage. I used the following AWS services:
  1. AWS CloudWatch
  2. AWS SNS (Simple Notification Service)
  3. AWS Lambda

To monitor publishing, I used an AWS Lambda function written in Python plus a few lines of inline SQL. Yes, just a few lines of code give you all the information.
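My actual function is in Python, but the logic is small enough to sketch here in Java, the language used elsewhere on this blog. Treat everything below as an illustration only: the table and column names (PUBLISH_TRANSACTIONS, STATE), the DB_URL environment variable, and the metric namespace are assumptions to be adjusted to your own environment.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import com.amazonaws.services.cloudwatch.AmazonCloudWatch;
import com.amazonaws.services.cloudwatch.AmazonCloudWatchClientBuilder;
import com.amazonaws.services.cloudwatch.model.MetricDatum;
import com.amazonaws.services.cloudwatch.model.PutMetricDataRequest;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

public class PublishQueueMonitor implements RequestHandler<Object, String> {

    @Override
    public String handleRequest(Object input, Context context) {
        // Count the items waiting in the publish queue (schema is illustrative).
        try (Connection con = DriverManager.getConnection(System.getenv("DB_URL"));
             PreparedStatement ps = con.prepareStatement(
                     "SELECT COUNT(*) FROM PUBLISH_TRANSACTIONS WHERE STATE = ?")) {
            ps.setString(1, "WaitingForPublish");
            ResultSet rs = ps.executeQuery();
            rs.next();
            int queued = rs.getInt(1);

            // Push the count to CloudWatch as a custom metric for the dashboard.
            AmazonCloudWatch cw = AmazonCloudWatchClientBuilder.defaultClient();
            cw.putMetricData(new PutMetricDataRequest()
                    .withNamespace("Tridion/Publishing")
                    .withMetricData(new MetricDatum()
                            .withMetricName("WaitingForPublish")
                            .withValue((double) queued)));
            return "queued=" + queued;
        } catch (SQLException e) {
            throw new RuntimeException("Publish queue check failed", e);
        }
    }
}

Scheduling the function with a CloudWatch Events rule (say, every five minutes) keeps the metric fresh.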



These metrics are then used on a dashboard to generate real-time graphs. Similarly, we have scripts for the other publishing states, which help us monitor the progress of publishing. These scripts are very helpful when you have thousands of items in the queue waiting to be published.

Failed items

Published items

Next, we need a notification service that alerts us whenever the number of Broker Database connections climbs higher than expected, so that we can act proactively. To implement this we used the default AWS metrics, and with the help of AWS SNS we send out the notifications. Notifications can go out over email, SMS, HTTP, mobile push, etc., depending on your requirements.


You need to go to CloudWatch --> Alarms and create a new alarm. By default, the maximum number of user connections in SQL Server is set to 0, which means unlimited, but using AWS CloudWatch you can monitor the connection count and take proactive steps when it starts increasing.
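For reference, here is a rough sketch of creating such an alarm with the AWS SDK for Java. It assumes the Broker Database runs on RDS, so the default AWS/RDS DatabaseConnections metric is available; the alarm name, instance identifier, threshold, and SNS topic ARN are all placeholders.

import com.amazonaws.services.cloudwatch.AmazonCloudWatch;
import com.amazonaws.services.cloudwatch.AmazonCloudWatchClientBuilder;
import com.amazonaws.services.cloudwatch.model.ComparisonOperator;
import com.amazonaws.services.cloudwatch.model.Dimension;
import com.amazonaws.services.cloudwatch.model.PutMetricAlarmRequest;
import com.amazonaws.services.cloudwatch.model.Statistic;

public class BrokerDbAlarmSetup {

    public static void main(String[] args) {
        // Alarm when the average connection count stays above the threshold
        // for one five-minute period; the alarm action notifies an SNS topic.
        AmazonCloudWatch cw = AmazonCloudWatchClientBuilder.defaultClient();
        cw.putMetricAlarm(new PutMetricAlarmRequest()
                .withAlarmName("broker-db-connections-high")
                .withNamespace("AWS/RDS")
                .withMetricName("DatabaseConnections")
                .withDimensions(new Dimension()
                        .withName("DBInstanceIdentifier").withValue("broker-db"))
                .withStatistic(Statistic.Average)
                .withPeriod(300)
                .withEvaluationPeriods(1)
                .withThreshold(200.0)
                .withComparisonOperator(ComparisonOperator.GreaterThanThreshold)
                .withAlarmActions("arn:aws:sns:eu-west-1:123456789012:broker-db-alerts"));
    }
}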

Next, we send the notification when the limit is crossed, and for that we can use AWS SNS.
Configure SNS to send the notification.

SNS is the notification service here. We first need to create a topic; then, following the publisher/subscriber model, the notification is delivered to every subscriber. The protocols AWS SNS supports for subscriptions are shown below.

Protocols available
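Purely as a sketch with placeholder names, the same topic and email subscription can be created with the AWS SDK for Java; the other protocols (sms, http/https, sqs, lambda, application) are subscribed the same way.

import com.amazonaws.services.sns.AmazonSNS;
import com.amazonaws.services.sns.AmazonSNSClientBuilder;

public class NotificationSetup {

    public static void main(String[] args) {
        AmazonSNS sns = AmazonSNSClientBuilder.defaultClient();
        // Create the topic (idempotent: returns the existing ARN if it already exists).
        String topicArn = sns.createTopic("tridion-broker-alerts").getTopicArn();
        // Add an email subscriber; SNS sends a confirmation mail first.
        sns.subscribe(topicArn, "email", "ops-team@example.com");
    }
}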

Alternatively, you can configure auto scaling of your EC2 instances as the alarm action. We only have the notification service configured, but this option is available as well; it all depends on your requirements.

Auto Scaling option in case of Alarm 

We just saw how we can monitor SDL Tridion using AWS services and take proactive steps. Configuring these AWS services is pretty easy.


Happy Coding and Keep Sharing!!! 



Saturday 15 June 2019

Extending Content Delivery Storage in SDL WEB 8.5 - Part 1

In this blog, we are going to extend the storage layer in SDL WEB 8.5. First, we need to set up the project and make sure that the custom code interacts with the default process.

Pre-Requisites
  1. Eclipse, or your favorite Java IDE. You can also follow my previous blog on how to set up the project in Eclipse and generate a custom JAR. [here]
  2. Java 8
  3. Default SDL JARs
We can customize the way existing items are stored by the storage layer using the DAO ("Data Access Object") implementation pattern. We need to extend JPAComponentPresentationDAO and implement ComponentPresentationDAO; this allows us to extend the storage layer for Dynamic Component Presentations.

Step 1. Create a Java class file and add the following imports.
import com.tridion.broker.StorageException;
import com.tridion.storage.ComponentPresentation;
import com.tridion.storage.dao.ComponentPresentationDAO;
import com.tridion.storage.persistence.JPAComponentPresentationDAO;
import com.tridion.storage.util.ComponentPresentationTypeEnum;
import java.io.UnsupportedEncodingException;
import java.util.Collection;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
Step 2. Extend the class JPAComponentPresentationDAO and implement ComponentPresentationDAO. We also need to add @Component and @Scope("prototype") annotations between the import statements and the class definition, as in the code below.

Custom Storage Extension
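A minimal sketch of such a class, reusing the imports from Step 1. The class name and log messages are my own illustrations, and the constructor signatures may differ slightly between CD versions.

@Component
@Scope("prototype")
public class CustomComponentPresentationDAO extends JPAComponentPresentationDAO
        implements ComponentPresentationDAO {

    private static final Logger LOG =
            LoggerFactory.getLogger(CustomComponentPresentationDAO.class);

    public CustomComponentPresentationDAO(String storageId,
            EntityManagerFactory entityManagerFactory, String storageName) {
        super(storageId, entityManagerFactory, storageName);
        LOG.debug("Custom DAO initialised for storage {}", storageName);
    }

    public CustomComponentPresentationDAO(String storageId,
            EntityManagerFactory entityManagerFactory, EntityManager entityManager,
            String storageName) {
        super(storageId, entityManagerFactory, entityManager, storageName);
    }

    @Override
    public void create(ComponentPresentation itemToCreate,
            ComponentPresentationTypeEnum componentPresentationType)
            throws StorageException {
        // Log the incoming item, then delegate to the default behaviour.
        LOG.debug("Creating DCP for component {}", itemToCreate.getComponentId());
        super.create(itemToCreate, componentPresentationType);
    }
}

Delegating to super keeps the default storage behaviour intact; for now we only wrap it with our own logging.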

Step 3. Create a storage DAO bundle config file.
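The bundle file is a small piece of XML along the following lines; the file name (CustomStorageDAOBundles.xml) and the DAO class are illustrative and must match your own code.

<?xml version="1.0" encoding="UTF-8"?>
<StorageDAOBundles>
    <StorageDAOBundle type="persistence">
        <StorageDAO typeMapping="ComponentPresentation"
                    class="com.example.storage.CustomComponentPresentationDAO"/>
    </StorageDAOBundle>
</StorageDAOBundles>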



Step 4. Copy this config file into the Deployer config folder, then open the cd_storage_conf.xml file in your favorite editor and add the following element.

Edit storage config file.
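The element goes inside the relevant <Storage> section of cd_storage_conf.xml and looks roughly like this; the src value must match the bundle file from Step 3.

<StorageBindings>
    <Bundle src="CustomStorageDAOBundles.xml"/>
</StorageBindings>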

Step 5. Copy the custom JAR file into the Deployer and restart the Deployer.

Custom JAR.


Step 6. We are done with the code and configuration; now we need to publish a DCP and check the logs.

Logs

In the log file, we can see our custom log entry, and it returns the Component ID, which means the code is interacting with the default process.


In the next blog, we will look at the differences between the other methods, how our code changes on publishing, unpublishing, and republishing, and how to read the DCP content. Until then,

Happy Coding and Keep Sharing !!!