

Showing posts from July, 2018

Customise your Linux terminal prompt

This is how my final terminal prompt looks:

```
nuwans:mediation-dep-gw (master) $
```

I have removed the lengthy host name and the very long directory path, and added the current Git branch. This is done via the .bashrc file located in your home directory, where the PS1 variable controls the terminal prompt.

```shell
parse_git_branch() {
    git branch 2> /dev/null | sed -e '/^[^*]/d' -e 's/* \(.*\)/ (\1)/'
}

if [ "$color_prompt" = yes ]; then
    PS1='\[\033[01;32m\]\u:\[\033[01;34m\]\W\[\033[00m\]$(parse_git_branch) $ '
else
    PS1='\u:\W\$ '
fi
```

Paste the code above into .bashrc, replacing the existing prompt code. What have I done?

```
PS1='\[\033[01;32m\]\u:\[\033[01;34m\]\W\[\033[00m\]$(parse_git_branch) $ '
```

- PS1 - sets the prompt pattern used by the shell.
- \[\033[01;32m\] - sets the colour.
- \u - user name.
- \W - current folder (base name of the working directory).
- $(parse_git_branch) - reads the Git branch info; if the current folder is not a Git repository, an empty value is returned.
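To see what parse_git_branch actually does, the same sed pipeline can be run on sample `git branch` output (the branch names below are made up for illustration):

```shell
# Feed sample `git branch` output through the same sed filter:
# first delete every line that does not start with '*' (the non-current branches),
# then turn "* master" into " (master)" for display in the prompt.
printf '  develop\n* master\n' | sed -e '/^[^*]/d' -e 's/* \(.*\)/ (\1)/'
# prints " (master)"
```

Outside a Git repository, `git branch` prints nothing (its error goes to /dev/null), so the function returns an empty string and the prompt shows no branch.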

Reverse proxy vs forward proxy

A proxy is a server that stands between the client and the origin server: it receives the request, sends a new request to the other end, then retrieves the response and creates a new response for the client.

Client -> Proxy -> Server

Forward proxy: the proxy acts as the client towards the server, so the server does not know the real client; the real client is hidden from the server.

Reverse proxy: the proxy acts as the server towards the client, so the real server is hidden from the client.
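As a concrete illustration, a reverse proxy is commonly set up with a web server such as nginx. A minimal configuration sketch (the server name, port, and backend address are all hypothetical, not from the original post):

```nginx
server {
    listen 80;
    server_name example.com;  # hypothetical public host the client sees

    location / {
        # Clients only ever talk to this server; the real backend
        # at 127.0.0.1:8080 (hypothetical) stays hidden from them.
        proxy_pass http://127.0.0.1:8080;
    }
}
```

The client connects to example.com as if it were the origin server; nginx forwards the request to the backend and relays the response, which is exactly the "proxy acts as server for the client" behaviour described above.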

ELK Logstash Filter writing for response time, request and response

Now my ELK setup is up and running: it reads a log file and shows the records on a Kibana dashboard, but no business logic has been implemented yet. The new business requirement was to write a CSV file from a log file containing the request, the response, and the response time, and also to show them on Kibana. The problem I faced was that the request and the response arrive as two separate events in the log file. So the requirement can be split as below:

- read the requests and responses from a file
- correlate them, as they do not come in a single event
- calculate the response time
- create a CSV file with the request, response, and response time on the same line

As we have already configured our system to read log records from a file, there is no extra work to do to get both requests and responses into Elasticsearch. We decided to go with Logstash, as it is the tool recommended in the ELK stack to handle complex processing such as correlating events.
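One way to sketch this correlation is with Logstash's aggregate filter plus the csv output plugin. The grok pattern, field names, and file path below are all assumptions for illustration (each log line is assumed to carry a correlation id and a REQUEST/RESPONSE marker), not the actual filter from this setup:

```
filter {
  # Assumed log format: "<timestamp> REQUEST|RESPONSE <uuid> <payload>"
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{WORD:msg_type} %{UUID:correlation_id} %{GREEDYDATA:payload}" }
  }
  date { match => [ "ts", "ISO8601" ] }

  if [msg_type] == "REQUEST" {
    aggregate {
      task_id     => "%{correlation_id}"
      code        => "map['request'] = event.get('payload'); map['req_ts'] = event.get('@timestamp').to_f"
      map_action  => "create"
    }
  } else if [msg_type] == "RESPONSE" {
    aggregate {
      task_id     => "%{correlation_id}"
      code        => "
        event.set('request', map['request'])
        event.set('response', event.get('payload'))
        event.set('response_time', event.get('@timestamp').to_f - map['req_ts'])
      "
      map_action  => "update"
      end_of_task => true
      timeout     => 120
    }
  }
}

output {
  csv {
    path   => "/tmp/req_resp.csv"   # hypothetical output path
    fields => [ "correlation_id", "request", "response", "response_time" ]
  }
}
```

The aggregate filter keys its in-memory map on the correlation id, so the REQUEST event stores the payload and timestamp, and the matching RESPONSE event picks them up and computes the elapsed time before the row is written out.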