Saturday, 14 July 2018

SDL WEB and TLS 1.2 or higher

Are you still using the SSL/early TLS protocols? 


Do you work with partners or customers who haven't yet started migrating away from SSL/early TLS to a more secure encryption protocol? It's time to say goodbye to SSL/early TLS and reduce the risk of being breached.

Why 30 June 2018?

30 June 2018 was the deadline for disabling SSL/early TLS and implementing a more secure encryption protocol – TLS 1.1 or higher (TLS 1.2 is strongly encouraged) – in order to meet the PCI Data Security Standard (PCI DSS).

What is TLS?

TLS stands for Transport Layer Security and is the successor to SSL (Secure Sockets Layer). Both terms are thrown around a lot online, and you might see both referred to simply as SSL. TLS provides secure communication between web browsers and servers. The connection itself is secure because symmetric cryptography is used to encrypt the data transmitted. The keys are uniquely generated for each connection and are based on a shared secret negotiated at the beginning of the session, also known as the TLS handshake. Many application protocols, such as HTTPS, SMTP, POP3, and FTP, support TLS to encrypt data.
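
If you want to see which protocol version a .NET client actually negotiates with a given server, a minimal sketch like the one below (using only the standard SslStream API, with example.com as a stand-in host) can help:

    using System;
    using System.Net.Security;
    using System.Net.Sockets;

    class TlsProbe
    {
        static void Main()
        {
            // Open a TCP connection, perform the TLS handshake,
            // then report which protocol version was negotiated.
            using (var tcp = new TcpClient("www.example.com", 443))
            using (var ssl = new SslStream(tcp.GetStream()))
            {
                ssl.AuthenticateAsClient("www.example.com");
                Console.WriteLine($"Negotiated protocol: {ssl.SslProtocol}");
                Console.WriteLine($"Cipher algorithm:    {ssl.CipherAlgorithm}");
            }
        }
    }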

In my last project we faced this very issue: the client's security and infrastructure team enabled TLS 1.3 on the servers, and while implementing SDL WEB 8.5 we started getting handshake errors.

Issue caused by TLS 1.3

After investigating and consulting with SDL, we found the solution: SDL WEB 8.5 supports only up to TLS 1.2, and to make it use strong crypto we need to make some adjustments in the registry. Below are the registry entries that fixed the algorithm and TLS issue for us.

[Screenshots: SCHANNEL registry settings for the TLS 1.0, TLS 1.1, and TLS 1.2 Client keys]

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319]
"SchUseStrongCrypto"=dword:00000001
(new entry required)

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319]
"SchUseStrongCrypto"=dword:00000001
(new entry required)
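
If changing the registry is not an option for a particular .NET application you control, a similar effect can be approximated in code at startup. This is only a minimal sketch, assuming a .NET Framework 4.5+ application; the registry entries above remain the recommended, machine-wide fix:

    using System;
    using System.Net;

    class TlsBootstrap
    {
        static void Main()
        {
            // Opt this process in to TLS 1.2 (with TLS 1.1 as a fallback)
            // instead of the .NET Framework defaults.
            ServicePointManager.SecurityProtocol =
                SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11;

            using (var client = new WebClient())
            {
                // Any HTTPS call made by this process will now prefer TLS 1.2.
                Console.WriteLine(client.DownloadString("https://www.example.com/"));
            }
        }
    }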

Happy Coding and Keep Sharing !!!

Saturday, 23 June 2018

SDL Tridion with IBM Watson Tone Analyzer

The Tone Analyzer service analyzes the tone of input content.

One of the things artificial intelligence and machine learning bring to content management systems is an understanding of how content performs. Content creators can analyze how content comes across to readers and which tone resonates with which audience.

For a content creator or marketer, tone analysis is one of the most useful features powered by machine learning. The IBM Watson Tone Analyzer service detects emotional, social, and language tones in written text. It provides insight into how the content comes across to readers; for each tone it returns a score between 0 and 1, and a score of 0.75 or higher indicates the tone is very likely to be perceived in the content.

In this article we are going to see how we can use this service with SDL Tridion to analyze the tone of website content. I have created an ASP.NET MVC application and used it as a custom page in Tridion.

Custom Page

In this application we have three sections:
  • Analyze by URL – enter a URL and the application reads the page and analyzes the tone of its content.
  • Analyze custom text – paste your own text and analyze it.
  • An endpoint that takes text as input and returns the analysis as JSON.

Let's see how it works with a website URL. Enter the URL of your website (or any other page you want to analyze) and the application will read the configured HTML elements and their content. In this example I have configured the H1, H2, H3, H4 and P tags, but other tags can be configured as well. For any URL you enter, the application scans that web page, extracts the text of those elements and sends it to the IBM Tone Analyzer service. For the demo I scanned the Analyzer web application itself to see how its content comes across to readers. A simplified sketch of this flow follows below.
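
The sketch below shows roughly what the URL flow does, assuming HtmlAgilityPack for the HTML parsing and Json.NET for string escaping. The Tone Analyzer URL and the credentials are placeholders, and the class and method names are illustrative rather than the application's actual code:

    using System;
    using System.Linq;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;
    using HtmlAgilityPack;
    using Newtonsoft.Json;

    class ToneByUrlDemo
    {
        // Placeholder values - substitute your own service credentials and region URL.
        const string ToneUrl  = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone?version=2017-09-21";
        const string Username = "<service-username>";
        const string Password = "<service-password>";

        static async Task Main()
        {
            // 1. Load the page and pull the text of the configured tags (H1-H4 and P).
            var doc   = new HtmlWeb().Load("https://example.com/");
            var nodes = doc.DocumentNode.SelectNodes("//h1|//h2|//h3|//h4|//p");
            var text  = string.Join(" ", nodes?.Select(n => n.InnerText.Trim()) ?? Enumerable.Empty<string>());

            // 2. Send the extracted text to the Tone Analyzer service using basic auth.
            using (var client = new HttpClient())
            {
                var credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes($"{Username}:{Password}"));
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", credentials);

                var body     = new StringContent("{\"text\": " + JsonConvert.ToString(text) + "}", Encoding.UTF8, "application/json");
                var response = await client.PostAsync(ToneUrl, body);

                // 3. The response is JSON with the document-level tones and their scores.
                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
        }
    }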

Analyzed Content by URL

Next is custom text. Here you can enter text manually and understand its tone before you actually publish it or enter it into a component.

Your Own Custom Text
The last one is an endpoint that takes text as input and returns the analysis as JSON, so if you want to use this service from another application you can call this endpoint directly. A rough sketch of such an endpoint is shown below.
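
As a minimal sketch (the controller, route, and ToneService helper names are illustrative, not the plugin's actual code), a Web API endpoint like this would accept raw text and hand back the Tone Analyzer JSON:

    using System.Threading.Tasks;
    using System.Web.Http;

    public class ToneController : ApiController
    {
        // POST api/tone with the raw text in the request body.
        [HttpPost]
        [Route("api/tone")]
        public async Task<IHttpActionResult> Analyze([FromBody] string text)
        {
            if (string.IsNullOrWhiteSpace(text))
                return BadRequest("No text supplied.");

            // ToneService is a hypothetical wrapper around the HttpClient call shown earlier.
            string json = await ToneService.AnalyzeAsync(text);
            return Ok(json);
        }
    }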

Endpoint

Finally, this is also available on Docker. I pushed the image to my Docker Hub account, so you can download the Docker image and explore the Tone Analyzer service yourself. It has one limitation: since I am using a dev account, only a limited number of service calls are allowed.

Docker Image


Happy Coding and Keep Sharing !!!!

Saturday, 2 June 2018

SDL Tridion with IBM Watson Assistant

Well, in the past I've learned and worked on many CMS systems before getting engaged with SDL Tridion, and one thing I noticed is that they all have the same common features available, e.g. targeting, personalization, experience manager (inline editing), analytics, content management, etc.

"It's time to make your CMS more intelligent."

I would like to share my knowledge and findings on how we can use artificial intelligence and machine learning techniques in SDL Tridion, in the form of a chatbot assistant built with the IBM Watson Assistant service. I've trained this bot to help CMS users and answer the most commonly and frequently asked questions.

It is aimed especially at non-technical users ("content authors") and their most frequently asked questions. I trained this bot for a DXA implementation and the most common scenarios ("it's still learning"), but you can educate it according to your own CMS implementation, for example:

  • "Who can give me access on CMS or any specific Publication "
  • "Where, I should create content as per the blueprint hierarchy"   
  • "Where I should create Templates"
  • "Where I should create schema"

To know more about IBM Watson Assistant, please refer to my earlier blog posts.


Today we are going to integrate an IBM Watson Assistant sample chatbot into SDL Tridion, and for that I have created an Alchemy plugin. It is pretty simple and straightforward. You can always extend the functionality; it is not limited to a chatbot, as you will discover once you start exploring it. At its core, the plugin simply relays the user's message to the Watson Assistant message API, sketched below.
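
For reference, here is a minimal sketch of that call from .NET. The base URL, workspace ID, credentials, and version date are placeholders you would replace with your own service details:

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    class WatsonAssistantDemo
    {
        // Placeholder values - replace with your own service credentials and workspace.
        const string BaseUrl   = "https://gateway.watsonplatform.net/assistant/api";
        const string Workspace = "<workspace-id>";
        const string Username  = "<service-username>";
        const string Password  = "<service-password>";

        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                var credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes($"{Username}:{Password}"));
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", credentials);

                // Send one user utterance to the Assistant's message endpoint.
                var url  = $"{BaseUrl}/v1/workspaces/{Workspace}/message?version=2018-02-16";
                var body = new StringContent("{\"input\": {\"text\": \"Where should I create schemas?\"}}",
                                             Encoding.UTF8, "application/json");

                var response = await client.PostAsync(url, body);

                // The JSON response contains the intents Watson detected and the bot's reply text.
                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
        }
    }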

If you are new to Alchemy, please visit the WebStore and Alchemy4Tridion.

Install Alchemy Plugin

Conversation between user and Watson

Happy coding and keep sharing !!!

Friday, 30 March 2018

SDL Tridion India Dev Summit 2018

For the fourth year in a row, the SDL Tridion India community organized the SDL Tridion India Summit, held on 23rd and 24th March 2018.

People from different regions joined the event and shared their knowledge and experiences with others.

I got the opportunity to participate in this event as a speaker and share my experience of the MVP selection process. I also took the opportunity to share my knowledge with others and presented how we can use Docker to host a DXA application. Learn more

What is the SDL Tridion MVP Award?

  • Each year the SDL Tridion Most Valuable Professional Awards recognize individuals with a passion for sharing their knowledge and expertise.
  • You can share your knowledge by writing blog posts, creating videos, or helping other community members on Tridion Stack Exchange.
  • You can become a Most Valued Professional (MVP) by sharing your passion and knowledge.
  • Upload your videos about SDL Tridion.
  • Passion is what matters most.

Process

  • Nominations open in December. To nominate, send your details, including your blog posts, video URLs, and TREX profile, to sdl.tridion.mvp@sdl.com
  • The selection panel evaluates each nominee's voluntary contribution to the community over the past 12 months.
  • In the end, the MVP team summarizes the results.
  • The MVP Award is usually announced in the middle of February.

Some useful links related to SDL Tridion MVP Program

As I mentioned, I took this opportunity to share my knowledge about how to run SDL DXA in a Docker container.






Highlights of this year's SDL Tridion India Dev Summit


Welcome

Day 1: The first session was about the Tridion 9 roadmap


Kurt introduced Tridion Docs 


Followed by Alvin (Go To Market), Pankaj (Upgrade Strategies), and Venu (UDP and Content Mashups)




Day 2: Nuno started the first session of the day, giving insights into SDL Cloud features


Followed by:
  • Rajesh Kumar from SDL shared thoughts on pre-sales demos.
  • Kurt gave a demo of Tridion Docs.
  • Sayantan from SDL explained the connectors.
  • Priyank from Content Bloom gave a live demo of how to use the Core Service PowerShell module.
  • Raj Kumar from Sapient demonstrated various Tridion caching techniques.
  • Pankaj presented his second session, on UDP.

It was a great experience meeting all of you at the SDL Tridion India Dev Summit. Hope to see you again next year.

Saturday, 17 March 2018

RUN SDL DXA inside Windows Docker Container

In one of my previous blog posts we learned how to run SDL DXA theming inside Docker. Today we are going to run SDL DXA in a Windows Docker container. It is really quick and easy. To read the previous post, click here.

Steps to set up your SDL DXA web application in Docker

  1. Download the latest DXA version from the SDL GitHub account.
  2. Create a folder at any location; this path will be used as the physical location when you run web-install.ps1.
  3. Go to the root directory and create a Dockerfile.
  4. It is very important to understand what you need in your Dockerfile. I faced a couple of issues while working on this but managed to solve all of them.
  5. Once your SDL DXA is installed, copy and paste the sample Dockerfile content below into your file.
  6. Sample Dockerfile:

     # Windows Server Core image with IIS and ASP.NET pre-installed
     FROM microsoft/aspnet
     # Create the site's physical directory and copy the published DXA web app into it
     RUN New-Item c:\sdldxaindocker -type directory
     WORKDIR /sdldxaindocker
     COPY ./DXADOCKER/ .
     # Replace the default IIS site with one that points at the DXA web app
     RUN Remove-WebSite -Name 'Default Web Site'
     RUN New-Website -Name 'DXADOCKER' -Port 80 -PhysicalPath 'c:\sdldxaindocker' -ApplicationPool '.NET v4.5'

  7. Open PowerShell in admin mode and navigate to the root directory created in step 2.
  8. Run the following Docker commands:
    • docker build -t sdldxaindocker .     (refer to Screenshot 1)
    • docker run -d --name sdldxaindockerimage sdldxaindocker     (refer to Screenshot 2)
    • docker inspect -f "{{ .NetworkSettings.Networks.nat.IPAddress }}" sdldxaindockerimage     (refer to Screenshot 3)
  9. As the last step, run Set-TtmWebsite and update the -BaseUrls.
Advantages of using Docker:
  1. You can connect your source code with Docker, which keeps your code in sync with your Docker container and solves your CI problems.
  2. You can run this container on any platform.
  3. Automate your build and test pipelines and accelerate your software development.
  4. Minimal infrastructure.
  5. Load balancing can be managed easily.
  6. Improved DevOps efficiency.
  7. Easy distribution, sharing, and configuration of applications.
  8. Easy, real-time scaling of applications.

The output of your Docker commands.

ScreenShot 1
ScreenShot 2
Screenshot 3

Browse the URL http://172.24.179.102/




Happy coding and keep sharing !!!!

Monday, 15 January 2018

Monitor Performance of CM and CD servers using ElasticSearch, Kibana and Beats

Monitoring server performance is a really important activity. It helps us provide an uninterrupted, smooth, and better end-user experience, and it also helps in forecasting when to scale a server horizontally or vertically. Today we are going to see how we can use the open-source services Beats, Elasticsearch, and Kibana to monitor the performance of our servers. Before we move further, let's talk about these services.

  1. Beats: Beats are lightweight data shippers that send data from multiple machines to Logstash or Elasticsearch. They are small agents that collect data from your servers and index it in Elasticsearch, and with the help of Kibana we can represent that data in a visual form. Types of Beats and how to install and configure them
  2. Elasticsearch: Elasticsearch is a search engine based on Lucene. It comes with an extensible API and clients for C#, Java, cURL, PHP, etc. Learn more about Elasticsearch and how to install and configure Elasticsearch.
  3. Kibana: Kibana is a visualization application that gets its data from Elasticsearch. It provides a customizable and user-friendly UI in which you can combine various widget types to create your own dashboards. The dashboards can be easily saved, shared, and linked, and the charts can also be embedded in your own web applications.
Beats setup consists of:
Once you are done with the installation and configuration of all the prerequisites, let's see what it looks like when Beats push data into Elasticsearch and Kibana uses that data for visualization. Make sure the Beats services, Elasticsearch, and Kibana are all up and running.
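
The indexed data is not only for Kibana; it can also be consumed from a .NET application. Here is a minimal sketch using the NEST client, assuming Elasticsearch on its default port and Metricbeat writing to its default daily indices:

    using System;
    using Nest;

    class MetricbeatQueryDemo
    {
        static void Main()
        {
            // Point the client at the local Elasticsearch node and the Metricbeat indices.
            var settings = new ConnectionSettings(new Uri("http://localhost:9200"))
                .DefaultIndex("metricbeat-*");
            var client = new ElasticClient(settings);

            // Fetch a handful of the documents that Metricbeat has shipped.
            var response = client.Search<object>(s => s
                .Size(5)
                .Query(q => q.MatchAll()));

            Console.WriteLine($"Documents returned: {response.Documents.Count}");
        }
    }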

Beats services provide many data points based on the different activities performed on the server.

Let's see some of them in detail.
  1. Beats services up and running
  2. CPU and memory utilization
  3. HTTP monitoring of web URLs – here I have configured the staging and live DXA sites, but you can configure CMS or CD microservice URLs or any other web URLs.
  4. Windows event logs
  5. Disk utilization
Happy coding and keep sharing !!!

Saturday, 25 November 2017

Storage Extension Using ElasticSearch and Docker


The latest release of the ElasticSearch4Web8 framework is much cleaner, easier to use, and requires fewer configuration and setup steps. To know more about the previous release, refer to this link.

In the previous version of this framework, apart from the CMS and Deployer setup steps, we also had to host a custom WCF-based index service and install and configure Elasticsearch. In this release, I've moved all of these to Docker.

So now, instead of downloading the code, hosting the index service, and installing Elasticsearch, all we need to do is a docker pull and all the prerequisites are up and running.

Steps to set up the latest version of this framework:
  1. CMS setup
    • Copy the templating building block (TBB) to a location on your SDL WEB 8 CM server.
    • Upload the ESI4TIndexing.Templating.dll TBB to the WEB 8 CMS using TcmUploadAssembly.exe.
    • Create a Component Template with the following attributes:
      • Output Format – XML Fragment
      • Add the GetComponentAsXML TBB, Publish Binaries in Package, Link Resolver, and Cleanup Template.
  2. To set up Elasticsearch, run the following Docker commands:
    • docker pull hemkant/elasticsearch
    • docker run -p 9200:9200 -m3gb hemkant/elasticsearch
  3. To set up the generic WCF index service, run the following Docker commands:
    • docker pull hemkant/elasticsearchintegration4web8
    • docker run -p 83:83 hemkant/elasticsearchintegration4web8
  4. To verify that all the Docker containers are up and running, run the docker ps command.
  5. All these containers are available on hub.docker.com.
  6. The last step is to set up the Deployer service:
    • Open the cd_storage_config.xml storage configuration file from the /bin/config folder and add the following node under the Storages section:
      • <StorageBindings><Bundle src="CustomStorageDAOBundles.xml"/></StorageBindings>
    • Copy and paste the CustomStorageConfig.xml file and change the values of the following nodes:
      • ServiceEndPoint – URL of the index service
      • TemplateIdToIndex – TCM ID of the Component Template created in step 1 (CMS setup)
    • Copy the CustomStorageDAOBundles.xml file into the Content Delivery /bin/config folder.

That's all we need. We have configured the Elasticsearch integration with SDL WEB; now publish the components and the data will be stored in Elasticsearch.
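
For readers who want to picture what the Deployer is talking to, the WCF index service contract might look roughly like the sketch below. This is purely illustrative; the actual ESI4TIndexing service in the Docker image may expose different operation names and parameters:

    using System.ServiceModel;

    // Purely illustrative contract - the real service may differ.
    [ServiceContract]
    public interface IIndexService
    {
        // Called by the storage extension on publish: receives the component XML
        // produced by the GetComponentAsXML TBB and indexes it in Elasticsearch.
        [OperationContract]
        void AddOrUpdate(string tcmId, string componentXml);

        // Called on unpublish so the document is removed from the index as well.
        [OperationContract]
        void Remove(string tcmId);
    }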


Happy coding and keep sharing !!!