Sunday 5 September 2021

XML and JSON Response Format using Jackson

There will be some scenarios where you need to send both XML and JSON output based on the calling client's request. This can become tricky with message converters other than Jackson; Jackson provides the ability to generate both XML and JSON. 


Follow the below steps:


Step:1 Maven dependency.


<dependency>
    <groupId>com.fasterxml.jackson.dataformat</groupId>
    <artifactId>jackson-dataformat-xml</artifactId>
    <version>2.10.0</version>
</dependency>


Add the above dependency; once it is available, the Jackson message converter will be registered.


Step:2 Define the outputs in the Controller.


@RestController
public class HelloController {

    @RequestMapping(value = "/greet", consumes = MediaType.APPLICATION_JSON_VALUE,
            produces = { MediaType.APPLICATION_JSON_VALUE, MediaType.APPLICATION_XML_VALUE })
    public GreetingPojo index() {
        MessagePojo msg = new MessagePojo();
        msg.setMessage("Greetings from Spring Boot!");
        // GreetingPojo targetObj = new DozerBeanMapper().map(msg, GreetingPojo.class);
        GreetingPojo targetObj = GreetMapper.INSTANCE.msgtoGreet(msg);
        return targetObj;
    }
}


Step:3 Request for JSON/XML



We need to set the Accept header to application/xml or application/json; the output will be generated accordingly.


The only constraint is that we return the object from the controller as-is; Jackson takes care of the rest.
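
To see the same behaviour outside Spring, here is a minimal standalone sketch (illustrative, not part of the controller above; the Greeting class is a simplified stand-in for GreetingPojo) that serializes one object to both JSON and XML with Jackson, assuming jackson-databind and jackson-dataformat-xml are on the classpath:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.xml.XmlMapper;

public class DualFormatDemo {

    // A minimal stand-in for the GreetingPojo used in this post.
    public static class Greeting {
        private String welcomeMessage;
        public String getWelcomeMessage() { return welcomeMessage; }
        public void setWelcomeMessage(String welcomeMessage) { this.welcomeMessage = welcomeMessage; }
    }

    public static void main(String[] args) throws Exception {
        Greeting greeting = new Greeting();
        greeting.setWelcomeMessage("Greetings from Spring Boot!");

        // Same object, two converters: JSON via ObjectMapper, XML via XmlMapper.
        String json = new ObjectMapper().writeValueAsString(greeting);
        String xml = new XmlMapper().writeValueAsString(greeting);

        System.out.println(json);
        System.out.println(xml);
    }
}
```

Spring wires exactly these two converters behind the produces list, and picks one per request based on the Accept header.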

Happy Learning !!!!


MapStruct

It's always tough for me to set the values for POJO classes; if the POJO is complex, it can kill my whole day. Hence I was googling and found some solutions, which can be seen here


In this post, we are going to look at MapStruct.


What is MapStruct?


MapStruct is a code generator that greatly simplifies the implementation of mappings between Java bean types based on a convention over configuration approach.


The generated mapping code uses plain method invocations and thus is fast, type-safe, and easy to understand.


Hence, using MapStruct not only simplifies our work of setting the POJO values, it also removes the tight coupling between the code and the mappings.


It can also be used for DTOs: when converting an object from one type to another where most of the properties remain the same, MapStruct achieves this easily.


The following steps are used for mapping.


Step:1 Maven dependency


<properties>
    <java.version>1.8</java.version>
    <org.mapstruct.version>1.4.2.Final</org.mapstruct.version>
    <m2e.apt.activation>jdt_apt</m2e.apt.activation>
</properties>


<dependency>
    <groupId>org.mapstruct</groupId>
    <artifactId>mapstruct</artifactId>
    <version>${org.mapstruct.version}</version>
</dependency>


<plugins>
    <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
    </plugin>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.8.1</version>
        <configuration>
            <source>1.8</source> <!-- depending on your project -->
            <target>1.8</target> <!-- depending on your project -->
            <annotationProcessorPaths>
                <path>
                    <groupId>org.mapstruct</groupId>
                    <artifactId>mapstruct-processor</artifactId>
                    <version>${org.mapstruct.version}</version>
                </path>
                <!-- other annotation processors -->
            </annotationProcessorPaths>
        </configuration>
    </plugin>
</plugins>

Step:2 Create Interface


package com.greet.user.mapper;

import org.mapstruct.DecoratedWith;
import org.mapstruct.Mapper;
import org.mapstruct.Mapping;
import org.mapstruct.Mappings;
import org.mapstruct.factory.Mappers;

import com.greet.user.GreetingPojo;
import com.greet.user.MessagePojo;
import com.greet.user.decorator.GreetDecorator;

@Mapper
@DecoratedWith(GreetDecorator.class)
public interface GreetMapper {

    GreetMapper INSTANCE = Mappers.getMapper(GreetMapper.class);

    @Mappings({ @Mapping(source = "message", target = "welcomeMessage"),
            @Mapping(target = "greetings", constant = "I am From Mapstruts") })
    GreetingPojo msgtoGreet(MessagePojo msgPojo);

}


@Mapper marks the mapper interface, and @Mappings maps the values from the source object to the target object. The method msgtoGreet converts a MessagePojo to a GreetingPojo object.


source refers to properties of the MessagePojo, and target refers to properties of the GreetingPojo object.
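
For reference, the implementation MapStruct generates for this mapping boils down to plain getter/setter calls. A hand-written equivalent (with the POJO classes simplified here for illustration) looks roughly like this:

```java
// Simplified stand-ins for the POJOs used in this post.
class MessagePojo {
    private String message;
    public String getMessage() { return message; }
    public void setMessage(String message) { this.message = message; }
}

class GreetingPojo {
    private String welcomeMessage;
    private String greetings;
    public String getWelcomeMessage() { return welcomeMessage; }
    public void setWelcomeMessage(String welcomeMessage) { this.welcomeMessage = welcomeMessage; }
    public String getGreetings() { return greetings; }
    public void setGreetings(String greetings) { this.greetings = greetings; }
}

// Roughly what MapStruct generates for GreetMapper: no reflection, just method calls.
public class GreetMapperImpl {

    public GreetingPojo msgtoGreet(MessagePojo msgPojo) {
        if (msgPojo == null) {
            return null;
        }
        GreetingPojo greetingPojo = new GreetingPojo();
        greetingPojo.setWelcomeMessage(msgPojo.getMessage()); // source = "message", target = "welcomeMessage"
        greetingPojo.setGreetings("I am From Mapstruts");     // target = "greetings", constant
        return greetingPojo;
    }

    public static void main(String[] args) {
        MessagePojo msg = new MessagePojo();
        msg.setMessage("Greetings from Spring Boot!");
        GreetingPojo target = new GreetMapperImpl().msgtoGreet(msg);
        System.out.println(target.getWelcomeMessage());
    }
}
```

This is why the generated code is fast and type-safe: a mapping mistake becomes a compile error, not a runtime failure.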


Step:3 Create a decorator class


In case the mapped object needs some changes, we can use this decorator class to make them.


package com.greet.user.decorator;

import org.springframework.beans.factory.annotation.Autowired;

import com.greet.user.GreetingPojo;
import com.greet.user.MessagePojo;
import com.greet.user.mapper.GreetMapper;

public abstract class GreetDecorator implements GreetMapper {

    @Autowired
    private GreetMapper delegate;

    public GreetDecorator(GreetMapper delegate) {
        this.delegate = delegate;
    }

    @Override
    public GreetingPojo msgtoGreet(MessagePojo msgPojo) {
        GreetingPojo dto = delegate.msgtoGreet(msgPojo);
        // add manipulations to the object here.
        return dto;
    }

}


Step:4 Invoking from the class.


Call this wherever you have to convert the object.


GreetingPojo targetObj = GreetMapper.INSTANCE.msgtoGreet(msg);


This will convert the MessagePojo to the GreetingPojo object.


Happy Learning!!!!


Thursday 29 July 2021

ORA-01882: timezone region not found in Docker

This issue occurs when the development environment and the DB environment differ.

For example: my local box runs in the IST zone and the Docker container in the UTC zone, hence I hit this error.

To resolve this, follow the below steps.

Create the file docker-compose.yml

version: '3'
services:
  app:
    build: .
    image: my-image
    ports:
      - "8084:8084"
    environment:
      - TZ=Asia/Kolkata


In the above docker-compose file, I set the environment variable TZ to Asia/Kolkata. These time zone names can be found from the link.

Building and Starting the Docker Image

Use docker compose up

It will start the image without this error.
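
If changing the container environment is not an option, the same error can usually be avoided from the Java side by fixing the JVM's default time zone before the first JDBC connection is opened (equivalently, pass -Duser.timezone=Asia/Kolkata on the java command line). A minimal sketch:

```java
import java.util.TimeZone;

public class TimezoneFix {
    public static void main(String[] args) {
        // Use an explicit region-style ID so the Oracle driver sends
        // a time zone region the database recognises.
        TimeZone.setDefault(TimeZone.getTimeZone("Asia/Kolkata"));
        System.out.println(TimeZone.getDefault().getID());
    }
}
```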

Happy Solvings!!!!

PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

The above error states that the server you are accessing presents a certificate that your machine does not trust. You have to import it locally; in the case of Docker, follow the below steps.


This post assumes that you have valid certificates with you before proceeding.



Starting Docker Images with PKI certificates

Happy Learning !!!!

Starting Docker Images with PKI certificates

Before starting, make sure you copy the certificate to the location of the Dockerfile. Edit the following Dockerfile according to your requirement; my requirement was to copy the jar and start it in Docker.


Below is my Dockerfile


FROM java:8
EXPOSE 8084
ADD ./target/my.jar my_docker.jar
USER root
COPY my.cer $JAVA_HOME/jre/lib/security
RUN cd $JAVA_HOME/jre/lib/security \
    && keytool -keystore cacerts -storepass changeit -noprompt -trustcacerts -importcert -alias ldapcert -file my.cer
ENTRYPOINT ["java","-jar","my_docker.jar"]


Once you copy the certificate and import it using keytool, it will be added to the JVM trust store.


After this, build the image and run it, mapping the port.


Building the Docker image


docker build -t cert-sample .


Running the Docker image


docker run -d -p 8084:8084 cert-sample


Happy Learning!!!!

Unable to connect to server: x509: certificate signed by unknown authority

Today I was facing a strange issue: when trying to connect to Kubernetes on a remote server, I got the error "unable to connect to server: x509: certificate signed by unknown authority".


The solution is to disable TLS verification using the --insecure-skip-tls-verify flag, by following the posts.


Happy Solvings !!!

Accessing Remote Kubernetes server using the Kubectl

Below are the configurations required to make kubectl access a K8s cluster running on a remote server. I searched through the documentation and found the following solution, which works reliably.


"This Post does not require any certificate or key to access the remote k8s."


Below is the syntax given by the K8s documentation. 


Syntax:


kubectl config set-cluster default-cluster --server=https://<host ip>:6443 --certificate-authority <path-to-kubernetes-ca> --embed-certs


kubectl config set-credentials <credential-name> --client-key <path-to-key>.pem --client-certificate <path-to-cert>.pem --embed-certs


kubectl config set-context default-system --cluster default-cluster --user <credential-name>


kubectl config use-context default-system


Examples:


kubectl config set-cluster my-cluster --server=https://1.2.3.4 --insecure-skip-tls-verify=true


kubectl config set-credentials my-credentials [--token=bearer_token] or [--username=username] [--password=password]


In my case it was a token, hence I used a token; you can use a username and password as well.


kubectl config set-context my-system --cluster my-cluster --user my-credentials --namespace=default


kubectl config use-context my-system


After making these changes, the context will be switched to my-system; executing kubectl will then return results from the remote K8s. In case you need to switch to local or another remote cluster, use the below command. This information is stored in the .kube/config file; to access it, go to Run (Win+R), type .kube, and hit Enter to see the file.


kubectl config use-context my-system


Happy Learning !!!!

Sunday 18 July 2021

Dozer Mapping

Dozer is a Java Bean to Java Bean mapper that recursively copies data from one object to another. Typically, these Java Beans will be of different complex types.


Dozer supports simple property mapping, complex type mapping, bi-directional mapping, implicit-explicit mapping, as well as recursive mapping. This includes mapping collection attributes that also need mapping at the element level.


We can see the implementation of Dozer in this post. Note that this approach is not recommended, since updates are no longer coming from the developers' side.


Here the Mapping is very easy.


Just add the Maven dependency, then identify the source and destination classes. Then add the mapping in the source class, and it will automatically convert the object.


Consider the below example.


Maven dependency.


<dependency>
    <groupId>net.sf.dozer</groupId>
    <artifactId>dozer</artifactId>
    <version>5.5.1</version>
</dependency>


Source Object:


package com.greet.user;

import org.dozer.Mapping;

public class MessagePojo {

    @Mapping("welcomeMessage")
    String message;

    public String getMessage() {
        return message;
    }

    public void setMessage(String message) {
        this.message = message;
    }

    public MessagePojo() {
    }

}



Destination Object:


package com.greet.user;

public class GreetingPojo {

    String welcomeMessage;
    String greetings;

    public GreetingPojo() {
        super();
    }

    public GreetingPojo(String welcomeMessage, String greetings) {
        super();
        this.welcomeMessage = welcomeMessage;
        this.greetings = greetings;
    }

    public String getWelcomeMessage() {
        return welcomeMessage;
    }

    public void setWelcomeMessage(String welcomeMessage) {
        this.welcomeMessage = welcomeMessage;
    }

    public String getGreetings() {
        return greetings;
    }

    public void setGreetings(String greetings) {
        this.greetings = greetings;
    }

}


Here the message field in MessagePojo needs to be mapped to the welcomeMessage field of GreetingPojo; hence I used the @Mapping annotation on the field in the message POJO with the target field name.


Then the class where we do the conversion needs the below code.


Mapping:


MessagePojo msg = new MessagePojo();
msg.setMessage("Greetings from Spring Boot!");
GreetingPojo targetObj = new DozerBeanMapper().map(msg, GreetingPojo.class);


where msg is the source object and GreetingPojo.class is the destination type. This converts the MessagePojo to the GreetingPojo. 


Happy Learning!!!!

Object Mappers in Java

What is meant by an object mapper? 


As the name suggests, it maps the data from one object to another. Consider the situation where you want to convert an object from one form to another. If the object is simple, we can copy the fields and save it in the new object ourselves; in the case of complex objects it is tougher, because you need to iterate over all the nested objects and copy them, which adds a lot of ad hoc code. In order to overcome this, we have the object mapper APIs.


There are various object mapper APIs available, which we can see in the individual posts. Some are:


1. Dozer.

2. MapStruct

3. ModelMapper 

4. Apache Commons BeanUtils.


Using these APIs we can convert one object to another with minimal effort.
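
For contrast, the manual approach these APIs replace looks like the sketch below (the two classes are hypothetical examples). Every field is copied by hand, so each new attribute means another line, and nested objects multiply the work:

```java
// Hypothetical source and target types, for illustration only.
class SourceUser {
    String firstName;
    String city;
}

class TargetUser {
    String firstName;
    String city;
}

public class ManualMapping {

    // Every field is copied explicitly; a mapper API generates or performs this for us.
    static TargetUser map(SourceUser src) {
        TargetUser target = new TargetUser();
        target.firstName = src.firstName;
        target.city = src.city;
        return target;
    }

    public static void main(String[] args) {
        SourceUser src = new SourceUser();
        src.firstName = "Syed";
        src.city = "Salem";
        TargetUser target = map(src);
        System.out.println(target.firstName + " " + target.city);
    }
}
```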

Persistent Volumes (PV) Storing Files in Kubernetes

We can go through the definition of PV and PVC First. 


A PersistentVolume (PV) is a piece of storage in the cluster that has been provisioned by an administrator or dynamically provisioned using Storage Classes. It is a resource in the cluster just like a node is a cluster resource. 


A PersistentVolumeClaim (PVC) is a request for storage by a user. It is similar to a Pod. Pods consume node resources and PVCs consume PV resources. Pods can request specific levels of resources (CPU and Memory). Claims can request specific size and access modes (e.g., they can be mounted ReadWriteOnce, ReadOnlyMany, or ReadWriteMany, see AccessModes).


These are the definitions from the K8s documentation. In short, persistent volumes provide space where we can store the files required for the functioning of our application.


Consider my requirement: I need to store large shell script files and then use them for a cron job to trigger. A persistent volume is one of the ways to do it.

This is one of the approaches to achieve it, there are also other ways where we can achieve this.


We cannot access a persistent volume without a persistent volume claim. Hence, while creating the persistent volume, we are also asked to create the persistent volume claim.


This persistent volume remains even after the pod is deleted.


Step:1 Persistent Volume creation in K8s.


apiVersion: v1
kind: PersistentVolume
metadata:
  name: scripts-pv-volume
  labels:
    type: local
spec:
  storageClassName: manual
  capacity:
    storage: 10Gi
  accessModes:
    - ReadWriteOnce
  hostPath:
    path: "/opt/data"

Step:2 Create the Persistent Volume claim


apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: scripts-pv-claim
spec:
  storageClassName: manual
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 3Gi

Step:3 Use the Storage.


Copy the script files to the "/opt/data" persistent volume with the below command.


kubectl cp welcome.ksh default/mypod:/opt/data


where default is the namespace, mypod is the name of my pod, and /opt/data is the path where this file needs to be copied.


apiVersion: batch/v1beta1
kind: CronJob
metadata:
  name: welcome-jobs
spec:
  schedule: "*/5 * * * *"   # example schedule: every five minutes
  jobTemplate:
    spec:
      template:
        spec:
          volumes:
            - name: scripts-pv-storage
              persistentVolumeClaim:
                claimName: scripts-pv-claim
          containers:
            - name: scripts-pv-container
              image: busybox
              command: ["/opt/data/welcome.ksh"]
              volumeMounts:
                - mountPath: "/opt/data"
                  name: scripts-pv-storage
          restartPolicy: OnFailure


This will execute the Script welcome.ksh from the location /opt/data.


Happy Learning !!!!


Kubernetes read local docker images

Follow the below steps to read Docker images from the local machine instead of pulling them every time from Docker Hub. By default, Kubernetes always pulls images from Docker Hub.


This saves us a lot of time by avoiding the push to Docker Hub; pushing and tagging an image from local takes a long time.


There are two steps involved.


Step:1


Open the command prompt in admin mode and execute the below command.


C:\Users\Syed>minikube docker-env


Once you execute the command "minikube docker-env" you will see the following output. 


SET DOCKER_TLS_VERIFY=1

SET DOCKER_HOST=tcp://127.0.0.1:32770

SET DOCKER_CERT_PATH=C:\Users\Syed\.minikube\certs

SET MINIKUBE_ACTIVE_DOCKERD=minikube

REM To point your shell to minikube's docker-daemon, run:

REM @FOR /f "tokens=*" %i IN ('minikube -p minikube docker-env') DO @%i

Just copy the last line after REM and execute it in the same command prompt. 


C:\Users\Syed>@FOR /f "tokens=*" %i IN ('minikube -p minikube docker-env') DO @%i


After making this change, the local Docker images will be visible to K8s.


Step:2


In the K8s YAML file, make sure the image pull policy is set to "Never", and point the image field to the local Docker build name and tag. 


eg: imagePullPolicy: Never


Once you do the above two steps, from next time you can make changes to the Dockerfile locally, build it, and see the changes in K8s.


Happy Learning!!!!

Saturday 10 July 2021

Generating the Pojo Classes automatically

There are scenarios where we need to add new attributes often; in that case, generating the POJOs automatically will be helpful. You don't need to do a lot of work; just make some changes and they will be created. 

As far as I know, there are two ways we can generate these Pojo Classes.  

  • Using org.jvnet.jaxb2.maven2 
  • Using org.codehaus.mojo 

1. Generating through the org.jvnet.jaxb2.maven2 

The org.jvnet.jaxb2.maven2:maven-jaxb2-plugin is the most advanced and feature-full Maven plugin for XML Schema compilation. 


This Maven plugin wraps and enhances the JAXB Schema Compiler (XJC) and allows compiling XML Schemas (as well as WSDL, DTDs, RELAX NG) into Java classes in Maven builds. 


In order to use this, we need to have the XSD with us, through which we can generate the POJO classes. 


1. Create a directory src/main/resources/xsd and copy the XSD file there. 

2. Add the plugin configuration as below. 

<plugin>
    <groupId>org.jvnet.jaxb2.maven2</groupId>
    <artifactId>maven-jaxb2-plugin</artifactId>
    <version>0.12.1</version>
    <executions>
        <execution>
            <id>generate</id>
            <goals>
                <goal>generate</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <generatePackage>com.searchendeca.main.pojo</generatePackage>
        <generateDirectory>${project.basedir}/src/main/java</generateDirectory>
        <schemaDirectory>src/main/resources/xsd</schemaDirectory>
        <schemaIncludes>*.xsd</schemaIncludes>
    </configuration>
</plugin>

Here we specify where the files will be generated and the package they belong to; then press Alt+F5 (Maven refresh) and, congratulations, your POJO classes are generated automatically. 


2. Generating through the org.codehaus.mojo 

This plugin runs the XJC binding compiler from the JAXB distribution and integrates XJC’s configuration properties into a Maven project. 

Add the following plugin. 

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>jaxb2-maven-plugin</artifactId>
    <version>2.4</version>
    <executions>
        <execution>
            <id>xjc</id>
            <goals>
                <goal>xjc</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <sources>
            <source>src/main/resources/xsd/sample_CustomersOrders.xsd</source>
            <source>src/main/resources/xsd</source>
        </sources>
        <outputDirectory>src/main/java</outputDirectory>
        <!-- The package of your generated sources -->
        <packageName>com.searchendeca.main.pojo</packageName>
        <clearOutputDir>true</clearOutputDir>
        <addGeneratedAnnotation>false</addGeneratedAnnotation>
    </configuration>
</plugin>

 

If you are not specifying the output directory, it will generate into the target folder. 

Between the above two methods, I don't see any difference apart from the groupId; it's up to us which one to adopt for our project. There are some different tags for specifying the output directory, package, etc. Again, it's your call which one to adopt; the second approach has a lot of information available on the internet.

Find the whole project here.  

Happy Learning!!!!



 

Monday 5 July 2021

Create Xml Using the Stax Processor

StAX is a standard XML processing API that allows you to stream XML data from and to your application.


This API performs better than the DOM parser: it does not load the whole document into memory as DOM does. SAX is a push API, whereas StAX is a pull API. Using SAX for creating documents is not recommended.


The below code is used to generate a simple Student XML document.



package com.searchendeca.main;

import java.io.StringWriter;

import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamWriter;

public class StaxParserMain {

    private void createStudentElements(XMLStreamWriter xmlStreamWriter) {
        try {
            createStreamWriter("FirstName", "Syed", xmlStreamWriter);
            createStreamWriter("LastName", "Ghouse", xmlStreamWriter);
            createStreamWriter("City", "Salem", xmlStreamWriter);
        } catch (XMLStreamException e) {
            e.printStackTrace();
        }
    }

    private void createStreamWriter(String key, Object value, XMLStreamWriter xmlStreamWriter)
            throws XMLStreamException {
        if (value instanceof String && !((String) value).isEmpty()) {
            xmlStreamWriter.writeStartElement(key);
            xmlStreamWriter.writeCharacters((String) value);
            xmlStreamWriter.writeEndElement();
        } else {
            // null or empty values become a self-closing element
            xmlStreamWriter.writeEmptyElement(key);
        }
    }

    public static void main(String[] args) {
        StringWriter stringwriter = new StringWriter();
        XMLOutputFactory xmloutputfactory = XMLOutputFactory.newInstance();
        StaxParserMain stax = new StaxParserMain();
        try {
            XMLStreamWriter xmlStreamWriter = xmloutputfactory.createXMLStreamWriter(stringwriter);
            xmlStreamWriter.writeStartDocument();
            xmlStreamWriter.writeStartElement("Student");
            stax.createStudentElements(xmlStreamWriter);
            xmlStreamWriter.writeEndElement();
            xmlStreamWriter.writeEndDocument();
            xmlStreamWriter.flush();
            xmlStreamWriter.close();
            String xmlString = stringwriter.getBuffer().toString();
            System.out.println(xmlString);
        } catch (XMLStreamException e) {
            e.printStackTrace();
        }
    }

}
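
For reference, assuming the code runs as written, the printed result is a single unformatted line along these lines (the exact XML declaration may vary by StAX implementation):

```xml
<?xml version="1.0" ?><Student><FirstName>Syed</FirstName><LastName>Ghouse</LastName><City>Salem</City></Student>
```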


The whole project can be found on GitHub here.


Happy Learning !!!

Create XML Using DOM Parser in JAVA

The Document Object Model (DOM) is an official recommendation of the World Wide Web Consortium (W3C). DOM reads the entire document and is useful when the XML is small. Performance-wise it is slow compared to other parsers, since it loads the entire document into memory. We can perform operations using the DOM API; the document stays in a tree structure.


We can construct a simple Student XML document using the DOM parser.



We need to understand the following to get started.

The Node interface is the primary datatype for the entire Document Object Model. It represents a single node in the document tree. 

The Element interface represents an element in an HTML or XML document. Elements may have attributes associated with them; since the Element interface inherits from Node, the generic Node interface attribute attributes may be used to retrieve the set of all attributes for an element. 

In order to use the DOM parser, include the following dependency in Maven.

<dependency>
    <groupId>xml-apis</groupId>
    <artifactId>xml-apis</artifactId>
    <version>1.4.01</version>
</dependency>


In the case of a Spring Boot project, this should already be included as part of the starter package itself, I guess.


package com.searchendeca.main;

import java.io.StringWriter;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerConfigurationException;
import javax.xml.transform.TransformerException;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;

public class DomParserMain {

    private Element createStudentElement(Document doc, Element rootElement) {
        rootElement.appendChild(createElements(doc, "FirstName", "Syed"));
        rootElement.appendChild(createElements(doc, "LastName", "Ghouse"));
        rootElement.appendChild(createElements(doc, "City", "Salem"));
        return rootElement;
    }

    private Node createElements(Document doc, String name, String value) {
        Element node = doc.createElement(name);
        if (value != null && !value.isEmpty()) {
            node.appendChild(doc.createTextNode(value));
        }
        return node;
    }

    public static void main(String[] args) throws ParserConfigurationException {
        DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
        DocumentBuilder dBuilder;
        String sampleXml;
        DomParserMain domParser = new DomParserMain();
        dBuilder = dbFactory.newDocumentBuilder();
        Document doc = dBuilder.newDocument();
        Element rootElement = doc.createElement("Student");
        doc.appendChild(rootElement);
        rootElement = domParser.createStudentElement(doc, rootElement);
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer transformer = null;
        try {
            transformer = factory.newTransformer();
            StringWriter writer = new StringWriter();
            try {
                transformer.setOutputProperty(OutputKeys.INDENT, "yes");
                transformer.transform(new DOMSource(doc), new StreamResult(writer));
                sampleXml = writer.getBuffer().toString();
                System.out.println(sampleXml);
            } catch (TransformerException e) {
                e.printStackTrace();
            }
        } catch (TransformerConfigurationException e) {
            e.printStackTrace();
        }
    }

}



The above code creates the XML in the desired format.


The Whole Project can be found in the Github link here.


Happy Learning!!!!!

Tuesday 29 June 2021

Running Batch Files in Kubernetes (KSH Files)

Check out the files from GitHub here.


Navigate to the folder where the checkout was done and execute the below commands. 


C:\Users\Syed\Hello-K8s_Job> kubectl create configmap hello --from-file=hello.ksh

configmap/hello created


It creates the config map from the Script provided.


C:\Users\Syed\Hello-K8s_Job>kubectl apply -f deployment.yaml

cronjob.batch/hello-job created


This creates the cron job from deployment.yaml.


Access the minikube Dashboard.




In the Cron Jobs tab, we can see the job created, named "hello-job".


Once you click the job, you will find tabs for Active and Inactive jobs. All currently executing jobs will be under Active Jobs; finished jobs will be under Inactive Jobs. 


In case you need to trigger the job manually, press the play button at the top right of the title bar.





When you look at the logs of the executed jobs, we can see the logging that was added.




You can apply the same procedure for the complex KSH scripts as well.


Happy Learning.!!!!

Thursday 17 June 2021

Deploying SpringBoot Struts 2 Integration Project in Docker

Follow my previous post and create the sample Spring Boot Struts integration project from here.


Create a Docker File: Dockerfile


FROM tomcat:latest
ADD target/SpringStrutsDemo-0.0.1-SNAPSHOT.war /usr/local/tomcat/webapps/
EXPOSE 8080
CMD ["catalina.sh","run"]


Build and tag the Docker image: spring-strutsdemo


C:\Users\Syed\Spring-workspace\SpringStrutsDemo>docker build -t spring-strutsdemo .

Sending build context to Docker daemon  58.75MB

Step 1/4 : FROM tomcat:latest

 ---> 5505f7218e4d

Step 2/4 : ADD target/SpringStrutsDemo-0.0.1-SNAPSHOT.war /usr/local/tomcat/webapps/

 ---> a30a842ce761

Step 3/4 : EXPOSE 8080

 ---> Running in 35d616d2803f

Removing intermediate container 35d616d2803f

 ---> 2c848691227a

Step 4/4 : CMD ["catalina.sh","run"]

 ---> Running in 270c9c8d4b5d

Removing intermediate container 270c9c8d4b5d

 ---> f7b915b47c1f

Successfully built f7b915b47c1f

Successfully tagged spring-strutsdemo:latest


Run the docker image from the tag 


docker run -p 8080:8080 spring-strutsdemo

This will start on port 8080. If it does not start properly, there could be an issue with the Java version used in Docker, or the project was not generated properly and the war is invalid.

Access it using 

http://localhost:8080/SpringStrutsDemo-0.0.1-SNAPSHOT/message.action

where SpringStrutsDemo-0.0.1-SNAPSHOT is the context path. 





Happy Learning!!!!



Spring Boot Struts 2 Integration

In order to integrate Spring Boot with Struts 2, follow the sample below. This is a very basic project and gives you an understanding of Spring Boot and Struts 2 integration.

Create a Spring boot starter project with war packaging.


Maven Dependencies.


Add the following dependencies to your Maven project.

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-tomcat</artifactId>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>javax.servlet</groupId>
        <artifactId>javax.servlet-api</artifactId>
    </dependency>
    <!-- https://mvnrepository.com/artifact/javax.servlet/jsp-api -->
    <dependency>
        <groupId>javax.servlet</groupId>
        <artifactId>jsp-api</artifactId>
        <version>2.0</version>
        <scope>provided</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/javax.servlet/servlet-api -->
    <dependency>
        <groupId>javax.servlet</groupId>
        <artifactId>servlet-api</artifactId>
        <version>2.5</version>
        <scope>provided</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.struts/struts2-core -->
    <dependency>
        <groupId>org.apache.struts</groupId>
        <artifactId>struts2-core</artifactId>
        <version>2.5.26</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.struts/struts2-spring-plugin -->
    <dependency>
        <groupId>org.apache.struts</groupId>
        <artifactId>struts2-spring-plugin</artifactId>
        <version>2.5.26</version>
    </dependency>
    <dependency>
        <groupId>org.apache.struts</groupId>
        <artifactId>struts2-java8-support-plugin</artifactId>
        <version>2.5.2</version>
    </dependency>
</dependencies>


Create the Action Class: GreetUserAction.java


package com.searchendeca.demo.action;

import com.opensymphony.xwork2.ActionSupport;

public class GreetUserAction extends ActionSupport {

    private String message;

    public String getMessage() {
        return message;
    }

    public void setMessage(String message) {
        this.message = message;
    }

    @Override
    public String execute() throws Exception {
        return SUCCESS;
    }
}

This class extends the ActionSupport class and has execute as its default method.


Create Struts Configuration File: Struts2Configuration.java


package com.searchendeca.demo.config;

import org.apache.struts2.dispatcher.filter.StrutsPrepareAndExecuteFilter;
import org.springframework.boot.web.servlet.FilterRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import javax.servlet.DispatcherType;

@Configuration
public class Struts2Configuration {

    @Bean
    public FilterRegistrationBean someFilterRegistration() {
        FilterRegistrationBean registration = new FilterRegistrationBean();
        registration.setFilter(new StrutsPrepareAndExecuteFilter());
        registration.addUrlPatterns("*.action");
        registration.setDispatcherTypes(DispatcherType.REQUEST, DispatcherType.FORWARD);
        registration.setName("StrutsPrepareAndExecuteFilter");
        return registration;
    }
}


This is where we register the Struts filter and restrict which URLs it handles; in the sample above, only requests matching *.action are dispatched to Struts.
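The `*.action` pattern is a servlet-style extension mapping: a request is handed to the Struts filter only when its path ends in `.action`, while everything else (JSPs, static resources) bypasses it. A tiny sketch of that matching rule:

```java
public class UrlPatternSketch {

    // Servlet extension mapping "*.action": match any request path that
    // ends with the ".action" suffix.
    public static boolean matchesActionPattern(String path) {
        return path.endsWith(".action");
    }

    public static void main(String[] args) {
        System.out.println(matchesActionPattern("/message.action")); // true
        System.out.println(matchesActionPattern("/greetUser.jsp"));  // false
    }
}
```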


Create Struts File: struts.xml


<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE struts PUBLIC
        "-//Apache Software Foundation//DTD Struts Configuration 2.5//EN"
        "http://struts.apache.org/dtds/struts-2.5.dtd">
<struts>
    <constant name="struts.devMode" value="true"/>
    <package name="basicStruts2" extends="struts-default">
        <action name="message" class="com.searchendeca.demo.action.GreetUserAction" method="execute">
            <param name="message">Welcome to SearchEndeca</param>
            <result name="success">/greetUser.jsp</result>
        </action>
    </package>
</struts>

In this file we register the action and map its result codes to views. In the example above, the success result maps to /greetUser.jsp, and the message parameter is injected into the action so the JSP can display it.
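The wiring in struts.xml amounts to lookup tables: the action name selects the action class, and the string returned by `execute()` selects the view. A hypothetical plain-Java sketch of the success mapping above:

```java
import java.util.Map;

public class ResultMappingSketch {

    // Mirrors <result name="success">/greetUser.jsp</result>: the result
    // code returned by execute() selects the JSP to render.
    private static final Map<String, String> RESULTS =
            Map.of("success", "/greetUser.jsp");

    public static String resolve(String resultCode) {
        return RESULTS.get(resultCode);
    }

    public static void main(String[] args) {
        System.out.println(resolve("success"));
    }
}
```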


Create Jsp:  greetUser.jsp


<%@ page language="java" contentType="text/html; charset=ISO-8859-1"
    pageEncoding="ISO-8859-1"%>
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
    "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title>Greet User</title>
</head>
<body>
    <center>
        <h3>${message}</h3>
    </center>
</body>
</html>


ServletInitializer Class: ServletInitializer.java


package com.searchendeca.demo;

import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;

public class ServletInitializer extends SpringBootServletInitializer {

    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
        return application.sources(SpringStrutsDemoApplication.class);
    }
}

SpringBoot Application Class: SpringStrutsDemoApplication.java

package com.searchendeca.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SpringStrutsDemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringStrutsDemoApplication.class, args);
    }
}


After this, run `mvn clean package` to build the WAR file, then deploy it to Tomcat or run it on the embedded server.


In the browser, navigate to http://localhost:8080/SpringStrutsDemo/message.action. The page displays the greeting message "Welcome to SearchEndeca" rendered by greetUser.jsp.

This project is available on GitHub here.

Happy Learning!!!!

Saturday 29 May 2021

AtoZ Fitness BP Module

This app has the following screens:

1. Registration
2. Add BP Readings
3. View BP Readings
4. Welcome Screen
5. Graphical View

The code for this app can be found on GitHub here.






GraphView in Android

Add the following Gradle dependency:

implementation group: 'com.jjoe64', name: 'graphview', version: '4.2.2'

The following code defines a GraphView with two line graph series (systolic and diastolic):

package com.atozfit.Activity;

import android.graphics.Color;
import android.os.Bundle;

import androidx.appcompat.app.AppCompatActivity;

import com.atozfit.R;
import com.atozfit.Service.AtoZBPService;
import com.atozfit.main.AtoZBPAttributes;
import com.jjoe64.graphview.DefaultLabelFormatter;
import com.jjoe64.graphview.GraphView;
import com.jjoe64.graphview.helper.StaticLabelsFormatter;
import com.jjoe64.graphview.series.DataPoint;
import com.jjoe64.graphview.series.LineGraphSeries;

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Locale;

public class GraphViewer extends AppCompatActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_graph_view);
        drawGraph();
        getSupportActionBar().setDisplayShowTitleEnabled(false);
    }

    public void drawGraph() {
        List<AtoZBPAttributes> data;
        AtoZBPService bpService = new AtoZBPService();
        LineGraphSeries<DataPoint> systolicSeries = null;
        LineGraphSeries<DataPoint> diastolicSeries = null;
        GraphView graph1 = (GraphView) findViewById(R.id.systolic_big);
        StaticLabelsFormatter staticLabelsFormatter1 = new StaticLabelsFormatter(graph1);
        SimpleDateFormat dateFormat = new SimpleDateFormat("MMM-dd", Locale.ENGLISH);
        // data = bpService.fetchBasedOnDate(bpService.retrieveBPData());
        data = bpService.retrieveBPData();
        String[] labels = setStaticLabels(data);
        try {
            systolicSeries = new LineGraphSeries<DataPoint>(generateDataPoint("systolic", data));
        } catch (ParseException e) {
            e.printStackTrace();
        }
        systolicSeries.setDrawDataPoints(Boolean.TRUE);
        systolicSeries.setColor(Color.RED);
        systolicSeries.setDataPointsRadius(6);
        try {
            diastolicSeries = new LineGraphSeries<DataPoint>(generateDataPoint("diastolic", data));
        } catch (ParseException e) {
            e.printStackTrace();
        }
        diastolicSeries.setDrawDataPoints(Boolean.TRUE);
        diastolicSeries.setDataPointsRadius(6);
        diastolicSeries.setColor(Color.BLUE);
        graph1.addSeries(systolicSeries);
        graph1.getGridLabelRenderer().setVerticalAxisTitle("Systolic & Diastolic");
        graph1.getGridLabelRenderer().setHorizontalAxisTitle("Date");
        graph1.addSeries(diastolicSeries);
        graph1.getViewport().setXAxisBoundsManual(true);
        graph1.getViewport().setYAxisBoundsManual(true);
        graph1.getViewport().setMinY(30);
        graph1.getViewport().setMaxY(220);
        graph1.getViewport().setScalable(true);
        graph1.getViewport().setScrollable(true);  // enables horizontal scrolling
        graph1.getViewport().setScalableY(true);   // activates vertical zooming and scrolling
        graph1.getViewport().setScrollableY(true); // enables vertical scrolling
        graph1.getViewport().setDrawBorder(false);
        graph1.getViewport().setBackgroundColor(Color.TRANSPARENT);
        graph1.setTitleColor(R.color.purple_200);
        staticLabelsFormatter1.setHorizontalLabels(labels);
        graph1.getGridLabelRenderer().setLabelFormatter(staticLabelsFormatter1);
        // Note: this second call replaces the static-labels formatter above;
        // each x value (epoch millis) is rendered as a "MMM-dd" date label.
        graph1.getGridLabelRenderer().setLabelFormatter(new DefaultLabelFormatter() {
            @Override
            public String formatLabel(double value, boolean isValueX) {
                if (isValueX) {
                    return dateFormat.format(new Date((long) value));
                } else {
                    return super.formatLabel(value, isValueX);
                }
            }
        });
    }

    // Builds one DataPoint per reading: the parsed date (epoch millis) is the
    // x value, the systolic or diastolic reading is the y value.
    private DataPoint[] generateDataPoint(String type, List<AtoZBPAttributes> data) throws ParseException {
        SimpleDateFormat dateFormat = new SimpleDateFormat("MMM-dd-yyyy", Locale.ENGLISH);
        DataPoint[] values = new DataPoint[data.size()];
        if (!data.isEmpty()) {
            for (int i = 0; i < data.size(); i++) {
                if (type.equals("systolic")) {
                    values[i] = new DataPoint(dateFormat.parse(data.get(i).getDate()),
                            Double.parseDouble(data.get(i).getSystolic()));
                } else {
                    values[i] = new DataPoint(dateFormat.parse(data.get(i).getDate()),
                            Double.parseDouble(data.get(i).getDiastolic()));
                }
            }
        }
        return values;
    }
}
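The x-axis trick in the activity above is worth isolating: each reading's date string ("MMM-dd-yyyy") is parsed to a java.util.Date, its epoch millis become the DataPoint's x value, and the label formatter renders that value back as a short "MMM-dd" label. A plain-JDK sketch of the round trip (no Android or GraphView required):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class DateAxisSketch {

    // Same patterns the activity uses: one for parsing the stored reading
    // date, one for rendering the short axis label.
    private static final SimpleDateFormat PARSE =
            new SimpleDateFormat("MMM-dd-yyyy", Locale.ENGLISH);
    private static final SimpleDateFormat LABEL =
            new SimpleDateFormat("MMM-dd", Locale.ENGLISH);

    public static String axisLabel(String storedDate) {
        try {
            double x = PARSE.parse(storedDate).getTime(); // the DataPoint's x value
            return LABEL.format(new Date((long) x));      // what formatLabel() emits
        } catch (ParseException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(axisLabel("May-29-2021"));
    }
}
```

Using epoch millis as x keeps the points correctly spaced in time even when readings are taken at irregular intervals.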