
Posts

Showing posts from 2017

Slow CPU after resume / Ubuntu 16.04 LTS too slow after suspend and resume

You might experience slow performance in Ubuntu 16.04 after resuming from sleep or hibernate. The reason is that the CPU does not move away from powersave mode automatically. The intel_pstate driver is the one causing this problem, so we have to disable it and use the acpi-cpufreq driver instead. First, let's check whether your computer is in powersave mode:

cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor

If the answer is powersave, we have to move it to the performance state. Get your CPU info with:

cat /proc/cpuinfo | grep MHz

You are probably seeing 4 cores. Let's update the scaling governors:

echo 'performance' > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
echo 'performance' > /sys/devices/system/cpu/cpu1/cpufreq/scaling_governor
echo 'performance' > /sys/devices/system/cpu/cpu2/cpufreq/scaling_governor
echo 'performance' > /sys/devices/system/cpu/cpu3/cpufreq/scaling_governor

Now your computer is running as…
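The four echo commands above can be collapsed into one loop over however many cores the machine has. A minimal sketch as a shell function; the sysfs base directory is parameterised here purely so the logic can be exercised against a fake tree (the real path is /sys/devices/system/cpu), and it must be run as root on a live system:

```shell
# Write the chosen governor to every CPU's scaling_governor file.
# Usage (as root):  set_governor performance
set_governor() {
  governor="$1"
  base="${2:-/sys/devices/system/cpu}"   # second arg is for testing only
  for f in "$base"/cpu[0-9]*/cpufreq/scaling_governor; do
    echo "$governor" > "$f"
  done
}
```

Note that `sudo echo performance > file` does not work (the redirection happens before sudo), which is why the post does this from a root shell; `echo performance | sudo tee file` is the usual alternative.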

How to Summarize Real Time Event Data Using Siddhi CEP

These days I am working on a task to improve the performance of WSO2 Telco analytics, which is based on WSO2 DAS. Our product summarizes data using Apache Spark. When the data volume increases it hangs, even though we have applied WSO2 incremental processing. Hence we decided to do one level of data summarizing in real time, as soon as data arrives at the WSO2 Siddhi event processor: Spark summarizes per hour, while Siddhi summarizes per minute. To summarize data in real time we can use the Siddhi window feature. It can hold the events that arrive within a specified time and then release all the accumulated events at once. Since the accumulated events are released together, we can apply aggregation functions to summarize them. The following describes a few attempts at doing this. Attempt 01: this was the approach suggested by the WSO2 support team: https://github.com/wso2/analytics-apim/blob/master/features/org.wso2.analytics.apim.feature/src/main/resources/template-manager/executionplans/APIMAn…

Java Multithreaded Programming

A program can have two or more parts that run concurrently; each such part is a thread, so this is a specialised form of multitasking. Process-based multitasking is heavyweight and uses separate address spaces; it deals with the big picture of the application, and it is handled at the OS level rather than by the JVM. Thread-based multitasking is lightweight and uses the same address space, so communication between threads is easy; it deals with the detail of the part that is running, and the JVM has full control. Multithreading reduces idle time by running another thread while one is waiting for something like a slow network operation, and this prevents the whole application from blocking. On a single-core CPU no actual parallel execution happens; instead CPU time is shared between the threads. A running thread can be suspended and resumed, can be blocked while waiting, and once terminated cannot be resumed. Threads have priorities: a higher-priority thread can take CPU time by preempting a lower-priority one, or a lower-priority thread can voluntarily yield when it is blocked…

Java Heap Dump vs Thread Dump

A JVM heap dump is a snapshot of the JVM heap memory at a given time. So it is simply a representation of the JVM heap, that is, the state of the objects. A JVM thread dump is a snapshot of the JVM's threads at a given time: what each thread was doing at that moment, that is, the state of the threads. This helps in understanding things such as locked threads, hung threads and running threads. A heap dump holds more Java class-level information than a thread dump. For example, a heap dump is good for analysing JVM heap memory issues and OutOfMemoryError errors, and the JVM can be configured (with the -XX:+HeapDumpOnOutOfMemoryError flag) to generate one automatically when an OutOfMemoryError occurs. A thread dump can be triggered manually with kill -3 <pid>; despite the name, this sends SIGQUIT and does not kill the process, while a heap dump can be taken with jmap. Generating a heap dump is a computationally intensive task which will probably pause your JVM, so it isn't something to do often. A heap dump can be analysed using tools such as Eclipse Memory Analyzer. A core dump is an OS-level memory image of the process. It has more information…
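The standard JDK invocations for the two dump types can be sketched as below. The commands are wrapped in trivial string-building helpers so the exact syntax is visible in one place; the pid, the output file name heap.hprof, and the sample pid 4321 are illustrative assumptions (find a real pid with `jps` or `pgrep java`):

```shell
# Thread dump: same information kill -3 prints to the JVM's stdout,
# but jstack lets you redirect it wherever you like.
build_thread_dump_cmd() {
  echo "jstack -l $1"
}

# Heap dump: jmap writes a binary snapshot that Eclipse Memory
# Analyzer (MAT) can open.
build_heap_dump_cmd() {
  echo "jmap -dump:live,format=b,file=heap.hprof $1"
}

# Usage against a running JVM with pid 4321:
#   $(build_thread_dump_cmd 4321) > threads.txt
#   $(build_heap_dump_cmd 4321)
```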

Apache Spark Hadoop Word Count Example

Let's create a simple word count example using Spark and the Hadoop file system in a Linux environment. I hope you have already installed Apache Spark. Now create a simple text file; let's name it "input.txt" and let the content of the file be:

people are not as beautiful as they look, as they walk or as they talk. they are only as beautiful as they love, as they care as they share.

Start Spark by typing spark-shell in your Linux terminal:

[cloudera@quickstart ~]$ spark-shell

After some time you should be dropped into the Scala REPL. To get the word count we need to load the file content, split it into words, and count them. The only command you need is:

scala> sc.textFile("input.txt").flatMap(_.split(" ")).count

When you try to execute this command you will get an error saying:

org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://localhost:8020/user/input.txt

This is because Hadoop…
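The error happens because on this setup spark-shell resolves the relative path against HDFS, not the local disk, so the file has to be copied into HDFS first. A hedged sketch, assuming the Cloudera quickstart user's HDFS home (/user/cloudera); the `word_count` helper is just a plain-shell analogue of the split-and-count the Spark one-liner performs, included to make the logic concrete:

```shell
# Fix for the InvalidInputException above: copy the local file into
# HDFS so the relative path "input.txt" resolves (paths assume the
# Cloudera quickstart layout):
#
#   hdfs dfs -mkdir -p /user/cloudera
#   hdfs dfs -put input.txt /user/cloudera/input.txt
#
# Plain-shell version of the same split-and-count: turn runs of
# spaces/newlines into single newlines, then count non-empty lines.
word_count() {
  tr -s ' \n' '\n\n' < "$1" | grep -c .
}
```

Run against the quote above, `word_count input.txt` prints 29, which should match what the Spark count returns once the file is in HDFS.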

WSO2 Siddhi Advance Example

Today I would like to show you how to write an advanced Siddhi query for WSO2 DAS. I will skip the basic steps, hoping you are already familiar with the Siddhi event processing language and the DAS server. Siddhi is an event processing language created by WSO2. It behaves much like the SQL query language, except that a Siddhi query is triggered by a received event, and Siddhi syntax looks a bit upside down compared to traditional SQL. DAS is the data analytics server introduced by WSO2. Introduction: what my example does is this: when the DAS server receives an event, it flows through my Siddhi example. DAS is connected to an external system from which it receives events. Each event has a JSON body in which the event type is hidden; based on the event type we will price the event and push the updated stream data to a new stream. To produce this result we will have to process the event using a Siddhi Java extension. Before explaining the Siddhi query, let me explain various functions…

Printing your JSON on a web page

Printing JSON on a web page wasn't easy: it looked messy and made your web page look very bad. There's a simple solution to overcome this: stringify your JSON and then put it inside an HTML <pre> tag. You will be amazed by the outcome. In the example below, jsonBody is the JSON string you want to print:

'<pre>' + JSON.stringify(JSON.parse(jsonBody), null, '\t') + '</pre>'

Change Ubuntu user password

Go to Linux recovery mode by pressing the Esc key until the boot option menu appears. Then select Advanced options, then select the recovery mode option. You should then see the recovery menu. Select the Update grub boot-loader option to get read-write access to the system, then select the Drop to root shell prompt option and press Enter to go to the root terminal. Now you can change the password easily:

root@ubuntu:~# passwd nuwans
Enter new UNIX password:
Retype new UNIX password:
passwd: password updated successfully
root@ubuntu:~#

You are done.

Troubleshooting

You might see an error like "key string is not matching" once you log in to the system. You can delete the keyring file so that a new one is created:

rm ~/.local/share/keyrings/login.keyring

Did you delete your fstab file in ubuntu 16.04 ?

A possible result would be your boot hanging at a black screen. The fix is restoring the fstab file. Unfortunately, if you don't have a backup you have to recreate the file manually. You should be able to get to recovery mode by pressing the Esc key until the boot option menu appears, then selecting Advanced options and then recovery mode. You should then see a white UI window. Select Update grub bootloader to get read-write access to the file system, then select Drop to root shell prompt. Now run sudo blkid to get the partition info:

sudo blkid

/dev/sda1: UUID="78985254-78ae-45b3-923d-804c7d56a7dc" TYPE="ext4" PARTUUID="ab55e511-01"
/dev/sda2: UUID="a30e14b5-0203-4ef3-a135-2dad9c26c309" TYPE="swap" PARTUUID="ab55e511-02"
/dev/sda3: UUID="377ce2fa-29c3-46d2-87da-88ff55502d66" TYPE="ext4" PARTUUID="ab55e511-03"

Since…
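Using the UUIDs from that blkid output, a rebuilt /etc/fstab would look roughly like the sketch below. The mount points are assumptions (sda1 as /, sda3 as /home); check your own partition layout before copying anything, as the wrong mapping will leave the system unbootable again:

```
# /etc/fstab sketch reconstructed from the blkid output above.
# <file system>                                <mount point>  <type>  <options>          <dump>  <pass>
UUID=78985254-78ae-45b3-923d-804c7d56a7dc      /              ext4    errors=remount-ro  0       1
UUID=a30e14b5-0203-4ef3-a135-2dad9c26c309      none           swap    sw                 0       0
UUID=377ce2fa-29c3-46d2-87da-88ff55502d66      /home          ext4    defaults           0       2
```

After saving the file, `sudo mount -a` is a safe way to check it for errors before rebooting.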