Hi there! This new post will guide you through installing CouchDB on AWS, step by step.

Core deps and dev tools. Enable the epel and epel-source repos by editing the file /etc/yum.repos.d/epel.repo. Next install the deps and tools:

sudo yum install gcc gcc-c++ libtool libicu-devel openssl-devel autoconf-archive erlang python27 python-sphinx help2man

Get the SpiderMonkey JS engine and build it:

wget http://ftp.mozilla.org/pub/mozilla.org/js/js185-1.0.0.tar.gz
tar xvfz js185-1.0.0.tar.gz
cd js-1.8.5/js/src
./configure
make
sudo make install

You should see it installed under /usr/local/lib.

Build CouchDB. Download the source package for CouchDB, unpack it and cd in (https://www.apache.org/dyn/closer.lua?path=/couchdb/source/1.6.1/apache-couchdb-1.6.1.tar.gz). Point it to the required libs and configure:

./configure --with-erlang=/usr/lib64/erlang/usr/include --with-js-lib=/usr/local/lib/ --with-js-include=/usr/local/include/js/
make
sudo make install

Prepare the CouchDB installation. Make a couchdb user: sudo useradd...
Continue reading...
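Where the excerpt above cuts off, the remaining housekeeping is roughly the following. This is a minimal sketch, assuming the default /usr/local install prefix and the usual couchdb user convention; the exact flags and paths are assumptions, not necessarily the original post's commands.

# Sketch only: paths assume the default /usr/local prefix used by the build above
sudo useradd -r -M -d /usr/local/var/lib/couchdb -s /bin/bash couchdb
sudo chown -R couchdb:couchdb /usr/local/var/lib/couchdb \
                              /usr/local/var/log/couchdb \
                              /usr/local/var/run/couchdb \
                              /usr/local/etc/couchdb

# Start CouchDB in the background as that user and check it answers
sudo -u couchdb /usr/local/bin/couchdb -b
curl http://127.0.0.1:5984/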
Hi there! Today I'll explain how to create a scheduled backup of AWS MySQL data using a snapshot script. The first step is to create an IAM user with permissions to do what our backup script requires. Create one in the IAM section of the AWS console and in the Inline Policies area give it the following policy:

{
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:CreateSnapshot",
        "ec2:CreateTags",
        "ec2:DeleteSnapshot",
        "ec2:DescribeSnapshots",
        "ec2:DescribeTags"
      ],
      "Resource": [ "*" ]
    }
  ]
}

Be sure to save the IAM user credentials (AWS access key ID and AWS secret access key). The next step is to create the script that will lock the MySQL db at night (do it on a slave instance to make sure your app will keep running during...
Continue reading...
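The post's own script locks the database before snapshotting; the sketch below shows a closely related variant that pauses replication on the slave instead, which avoids having to hold a session-scoped read lock open from a shell script. The volume ID, region, credentials and tag names are placeholders, not values from the original post.

#!/bin/bash
# Illustrative sketch only: all IDs and credentials below are placeholders.
VOLUME_ID="vol-xxxxxxxx"             # EBS volume holding the MySQL data directory
REGION="us-east-1"
MYSQL_ROOT_PASSWORD="..."
export AWS_ACCESS_KEY_ID="..."       # credentials of the IAM user created above
export AWS_SECRET_ACCESS_KEY="..."

# Pause replication on the slave and flush tables so the data files are consistent
mysql -u root -p"$MYSQL_ROOT_PASSWORD" -e "STOP SLAVE SQL_THREAD; FLUSH TABLES;"

# Take the EBS snapshot and tag it with today's date
SNAP_ID=$(aws ec2 create-snapshot --region "$REGION" --volume-id "$VOLUME_ID" \
  --description "mysql-backup-$(date +%F)" --query SnapshotId --output text)
aws ec2 create-tags --region "$REGION" --resources "$SNAP_ID" \
  --tags Key=Name,Value="mysql-backup-$(date +%F)"

# Resume replication
mysql -u root -p"$MYSQL_ROOT_PASSWORD" -e "START SLAVE;"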
In this post I'll explain how to install nginx and WordPress on an Ubuntu server.
* First make sure that your server's security group allows ports 80 and 443.

Step 1: Installation
The first two commands update the server's sources and install all the necessary utilities:

[root@ubuntu ~] apt-get update
[root@ubuntu ~] apt-get install nginx mysql-server php5-mysql php5-fpm

Next make sure that the nginx server is running:

[root@ubuntu ~] /etc/init.d/nginx start

At this point go to your browser and enter the public IP of your server. You should see something like this:

Step 2: Creating the database
Log in to the database:

[root@ubuntu ~] mysql -u root -p

Enter the following commands (change the values as you like):

mysql> CREATE DATABASE wordpress;...
Continue reading...
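The database step the excerpt truncates usually continues with a dedicated user and grant. This is a minimal sketch; the database name follows the excerpt, but the user name and password are placeholders of your own choosing, not necessarily the original post's values.

# Sketch: create the WordPress database, a dedicated user, and grant it access
# (user name and password are placeholders)
mysql -u root -p <<'SQL'
CREATE DATABASE wordpress;
CREATE USER 'wordpressuser'@'localhost' IDENTIFIED BY 'choose-a-password';
GRANT ALL PRIVILEGES ON wordpress.* TO 'wordpressuser'@'localhost';
FLUSH PRIVILEGES;
SQL

These are the same values you will later put into wp-config.php (DB_NAME, DB_USER, DB_PASSWORD).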
Kibana is basically the visualisation tool of Elasticsearch. In this blog you can find the installation process of all the parts of ELK – Elasticsearch, Logstash, Kibana. If you haven't yet installed Elasticsearch and Logstash, feel free to click:
How to install and configure Elasticsearch
How to install and configure Logstash

So first, let's briefly go over the purpose of Kibana in the ELK stack. This picture is very helpful for understanding the purpose of Kibana:
1. The log data is collected by Logstash.
2. Elasticsearch stores the data and allows full-text search, structured search, performing analytics etc.
3. Kibana visualises the data in a browser-based analytics and search dashboard.

Step 1: Installation
The first step is getting the installation...
Continue reading...
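For reference, the installation step typically looks like the sketch below. The version and download URL are assumptions on my part (Kibana 4.0.x is the release that pairs with the Elasticsearch 1.4.x used elsewhere in this blog); they are not taken from the original post.

# Sketch only: version and URL are assumed, not from the post
wget https://download.elasticsearch.org/kibana/kibana/kibana-4.0.1-linux-x64.tar.gz
tar xvfz kibana-4.0.1-linux-x64.tar.gz
cd kibana-4.0.1-linux-x64

# Point Kibana at the Elasticsearch server, then start it (listens on port 5601)
vi config/kibana.yml        # set: elasticsearch_url: "http://localhost:9200"
./bin/kibana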
In this post I will explain the very simple setup of Logstash on an EC2 server and a simple configuration that takes an input from a log file and puts it in Elasticsearch. If you don't already have an Elasticsearch server, feel free to click: how to install and configure elasticsearch in aws

Step 1: Installation
The first step is getting the installation from the official website:

[root@logstash ~] wget https://download.elasticsearch.org/logstash/logstash/packages/centos/logstash-1.4.2-1_2c0f5a1.noarch.rpm

Next, install the rpm using yum:

[root@logstash ~] yum install logstash-1.4.2-1_2c0f5a1.noarch.rpm

Now that was easy… we're done with the installation already.

Step 2: Configuration
For the configuration part, edit the following file:

[root@logstash ~] vi /etc/logstash/conf.d/logstash.conf

This is the main configuration file of Logstash. Let's put a simple configuration that...
Continue reading...
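Here is a minimal sketch of the kind of file-to-Elasticsearch configuration the excerpt goes on to describe; the log path and Elasticsearch host below are placeholders, not necessarily the values used in the original post.

# Sketch: tail a log file and ship it to Elasticsearch (path and host are placeholders)
cat > /etc/logstash/conf.d/logstash.conf <<'CONF'
input {
  file {
    path => "/var/log/messages"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    host => "localhost"        # the Elasticsearch server from the previous post
    protocol => "http"
  }
}
CONF

# Start the service so it picks up the new configuration
service logstash start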
Elasticsearch is a distributed, open source search and analytics engine. In this post I will show you the easiest way to install Elasticsearch and get it running on your AWS server.

Step 1: Installation
The first step is downloading the installation from the official website using the wget command (don't forget sudo su - first):

[root@elasticsearch ~] wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.4.1.noarch.rpm

The second step is installing the package we downloaded using yum install:

[root@elasticsearch ~] yum install elasticsearch-1.4.1.noarch.rpm

Next simply enter the new elasticsearch directory:

[root@elasticsearch ~] cd /usr/share/elasticsearch/

In this directory you'll need to install a few simple plugins. One of them is the special plugin for AWS, so in this case it's the most important one. So simply copy the following commands: [root@elasticsearch...
Continue reading...
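The plugin commands the excerpt cuts off typically look like the sketch below. The cloud-aws plugin version is an assumption (the 2.4.x line is the one that pairs with Elasticsearch 1.4.x), and the head plugin is a common optional extra that may or may not be in the original post.

# Sketch only: plugin versions are assumptions
cd /usr/share/elasticsearch
bin/plugin --install elasticsearch/elasticsearch-cloud-aws/2.4.1

# Optional cluster-overview UI
bin/plugin --install mobz/elasticsearch-head

# Start the service once the plugins are in place
service elasticsearch start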
Understanding the distinction between “private” and “public” subnets in Amazon VPC requires an understanding of how IP routing and network address translation (NAT) work in general, and how they are specifically implemented in VPC. The core differentiation between a public and a private subnet in VPC is defined by that subnet's default route in the VPC routing tables. This configuration dictates the validity of using, or not using, public IP addresses on instances in that particular subnet. Each subnet has exactly one default route, which can be only one of two things:
The VPC's “Internet Gateway” object, in the case of a “public” subnet
An EC2 instance, performing the “NAT instance” role, in the case of a “private” subnet....
Continue reading...
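To make the two configurations concrete, here is what those default routes look like when written with the AWS CLI; every ID below is a placeholder used only for illustration.

# Public subnet: the default route (0.0.0.0/0) points at the VPC's Internet Gateway
aws ec2 create-route --route-table-id rtb-11111111 \
    --destination-cidr-block 0.0.0.0/0 --gateway-id igw-aaaaaaaa

# Private subnet: the default route points at an EC2 instance acting as the NAT
# instance, which must have its source/destination check disabled
aws ec2 modify-instance-attribute --instance-id i-bbbbbbbb --no-source-dest-check
aws ec2 create-route --route-table-id rtb-22222222 \
    --destination-cidr-block 0.0.0.0/0 --instance-id i-bbbbbbbb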