
Saturday, July 13, 2019

Kibana





For Kibana

Kibana is a tool built with Node.js.

To change the port, edit /Users/z002qhl/Documents/ELK/Kibana/kibana-7.2.0-darwin-x86_64/config/kibana.yml
and set server.port: 5601

To start the server, go to /Users/z002qhl/Documents/ELK/Kibana/kibana-7.2.0-darwin-x86_64/bin and run
./kibana
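For reference, the relevant lines in config/kibana.yml would look like this (the elasticsearch.hosts entry is not from the original notes; it is shown as the usual companion setting, assuming Elasticsearch runs locally on 9200):

```yaml
# config/kibana.yml
server.port: 5601
elasticsearch.hosts: ["http://localhost:9200"]
```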




For Elasticsearch

To change the port, edit /Users/z002qhl/Documents/ELK/Elastic/elasticsearch-7.2.0/config/elasticsearch.yml
and set http.port: 9200

To change the JVM options (for example the heap size), edit the jvm.options file in the same config directory.

To start the server, go to /Users/z002qhl/Documents/ELK/Elastic/elasticsearch-7.2.0/bin and run
./elasticsearch

5601 and 9200 are the default ports for Kibana and Elasticsearch respectively.
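For reference, the relevant lines in the two Elasticsearch config files would look like this (the heap sizes are the 7.x defaults, shown for illustration; they are not from the original notes):

```yaml
# config/elasticsearch.yml
http.port: 9200

# config/jvm.options -- heap settings (illustrative defaults)
# -Xms1g
# -Xmx1g
```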



To check which processes are listening on which ports:

$ netstat -anp tcp | grep LISTEN
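As an alternative to netstat, a small Java sketch can test whether a specific local port is already bound (PortCheck and isPortFree are illustrative names, not from the original notes):

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortCheck {

    // Returns true if we can bind the port (it is free), false if something
    // is already listening on it.
    static boolean isPortFree(int port) {
        try (ServerSocket socket = new ServerSocket(port)) {
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        int port = args.length > 0 ? Integer.parseInt(args[0]) : 5601;
        System.out.println("Port " + port + (isPortFree(port) ? " is free" : " is in use"));
    }
}
```

Running it with no arguments checks Kibana's default port 5601.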

Tuesday, July 2, 2019

Kafka Producer in main program


package com.tgt.hdfsToKafka;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TestProducer {

    public static void main(String[] args) throws IOException {
        try (Producer<String, String> producer = createProducer()) {
            String message = readFile();

            ProducerRecord<String, String> record =
                    new ProducerRecord<>("scm-testing", "testmessagekey", message);

            // send() is asynchronous; get() blocks until the broker acknowledges the write
            producer.send(record).get();
        } catch (InterruptedException | ExecutionException e) {
            e.printStackTrace();
        }
    }

    private static Producer<String, String> createProducer() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9093");
        props.put(ProducerConfig.CLIENT_ID_CONFIG, "hdfsToKafka");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // SSL settings: the truststore holds the broker CA, the keystore holds the client certificate
        props.put("ssl.truststore.location", "/Users/basan/Desktop/asfasf/kafka_cert/dddddddclient.truststore.jks");
        props.put("ssl.truststore.password", "jansfjjasjfjasjfjj");
        props.put("ssl.keystore.location", "/Users/basan/Desktop/asfasf/kafka_cert/serversfasfasfasfasf.keystore.jks");
        props.put("ssl.keystore.password", "nasf as f n n n");
        props.put("security.protocol", "SSL");
        props.put("ssl.protocol", "TLSv1.2");
        return new KafkaProducer<>(props);
    }

    // Reads the whole file into a single String to use as the message payload
    static String readFile() throws IOException {
        return new String(Files.readAllBytes(Paths.get(
                "/Users/basan/Documents/workspace-sts-3.8.4.RELEASE/hdfsToKafka/sample.json")));
    }
}

Monday, July 1, 2019

Alfresco smart folder

Below are the steps:
  1. Upload the attached JSON file containing the smart folder definition to the Smart Folder Templates folder, and change its type to Smart Folder Template.
  2. Create the site, and under documentLibrary create a folder.
  3. Create content items, applying the aspect.
  4. On the folder where we would like to see the smart folder, apply the System Smart Folder aspect.
  5. Click Edit Properties and select the uploaded template JSON file.
  6. We should be able to see the structure below.
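The attached template file itself is not reproduced here, but a smart folder template is a JSON document whose nodes define virtual folders backed by search queries. A minimal sketch (the node name and query below are illustrative, not the original definition):

```json
{
  "name": "Smart Folder Root",
  "nodes": [
    {
      "id": "1",
      "name": "My Content",
      "search": {
        "language": "fts-alfresco",
        "query": "=cm:creator:%CURRENT_USER%"
      }
    }
  ]
}
```

Each node appears as a virtual folder under the folder carrying the System Smart Folder aspect, populated by the results of its query.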