

Elastic Stack Import-Export with Logstash & Logsene

In earlier posts, we explained how one can reindex data from one Elasticsearch cluster to another, or within the same Elasticsearch cluster, via tools like Logstash and rsyslog.

The same recipes apply to Logsene, because it exposes the Elasticsearch API. Not only can you push data to Logsene with anything that talks to Elasticsearch (such as Logstash), but you can also use Elasticsearch’s Scroll API to export data from Logsene. The only thing to remember is that, with Logsene, you specify your app token as the index name.
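For example, here’s how Logstash’s elasticsearch input (which uses the Scroll API under the hood) would read from a Logsene app. This is a minimal sketch; the size and scroll settings are optional, and the values below are just examples:

input {
  elasticsearch {
    hosts => ["logsene-receiver.sematext.com:80"]
    index => "LOGSENE_APP_TOKEN"   # your app token stands in for the index name
    size => 1000                   # documents fetched per scroll batch
    scroll => "5m"                 # how long each scroll context stays alive
  }
}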

Migrating data from your in-house ELK stack to Logsene

Let’s say you already have an Elastic stack deployed, but you want to migrate existing logs to Logsene. Maybe because you’re spending too much time and money on managing and scaling Elasticsearch, and you’d like to outsource that. Or because you’d like built-in features of Logsene like role-based access control or anomaly detection. Either way, you can migrate your data and keep using Elasticsearch-focused tools:

input {
  elasticsearch {
    hosts => ["localhost:9200"]   # your in-house Elasticsearch cluster
    index => "logstash-*"         # all the Logstash-created indices
  }
}

output {
  elasticsearch {
    hosts => "logsene-receiver.sematext.com:80"
    index => "DESTINATION_LOGSENE_APP_TOKEN"   # the app token acts as the index name
    manage_template => false                   # keep Logstash from pushing its own index template
  }
}
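To run the migration, you’d save this as, say, migrate.conf (the file name is just an example) and start Logstash with bin/logstash -f migrate.conf. The elasticsearch input scrolls through the source indices, and once it finishes, the import is complete.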

NOTE: Since Logsene plans are based on ingestion volume and retention, the throughput spike of an initial import may influence your costs. That shouldn’t be a problem if you just started and have a big enough trial plan. Even if the trial is over and you go over the selected plan, you’ll pay at the same per-GB rate.

Reindexing data from one Logsene app to another

Let’s say you’re prototyping: you’re tweaking your Logstash grok rules, and now you’d like to use a custom template. For the new template to apply, you’ll need a new index (i.e. a new Logsene app). So you can go ahead and create it, then reindex the data from the first app with Logstash. Here’s a sample config (you can also add filters to change data along the way – see the sketch after the output block). This time, the source is not your in-house Elasticsearch cluster, but a Logsene app that already has the logs you want to reindex:

input {
  elasticsearch {
    hosts => ["logsene-receiver.sematext.com:80"]
    index => "SOURCE_LOGSENE_APP_TOKEN"   # read from the existing Logsene app
  }
}

output {
  elasticsearch {
    hosts => "logsene-receiver.sematext.com:80"
    index => "DESTINATION_LOGSENE_APP_TOKEN"   # write to the new Logsene app
    manage_template => false
  }
}
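If you do want to change documents along the way, you’d slot a filter block between the input and the output. Here’s a minimal sketch with a hypothetical field rename (the field names are made up – adjust them to your data):

filter {
  mutate {
    rename => { "hostname" => "host" }   # hypothetical field names
  }
}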

NOTE: If you want SSL encryption, just add ssl => true to the elasticsearch input and/or output and change the port to 443.
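For example, the output above would become something like this (the same two changes apply to the input):

output {
  elasticsearch {
    hosts => "logsene-receiver.sematext.com:443"   # SSL port
    ssl => true
    index => "DESTINATION_LOGSENE_APP_TOKEN"
    manage_template => false
  }
}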

Exporting data from Logsene

Even though Logsene comes with Amazon S3 log archiving, you might need to export your logs somewhere else, using – you guessed it! – a similar config:

input {
  elasticsearch {
    hosts => ["logsene-receiver.sematext.com:80"]
    index => "LOGSENE_APP_TOKEN"   # the app you want to export from
  }
}

output {
  file {
    path => "/mnt/big_disk/big_log"   # writes one JSON document per line, by default
  }
}
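If you’d rather dump just the raw log lines instead of full JSON documents, you can set a codec on the file output. A minimal sketch, assuming the original line lives in the message field:

output {
  file {
    path => "/mnt/big_disk/big_log"
    codec => line { format => "%{message}" }   # write only the message field
  }
}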

See? No lock-in! With Logsene you can also easily go back to self-hosted, for example if you want to build something custom around your ELK stack. We can actually help you with that, through Elasticsearch and logging training and through logging consulting.

Compatibility notes

Older versions of Elasticsearch (before 2.0) need to have search_type set to scan for scrolling through the data to be efficient. Logstash does this by default up to version 2.3.x (the option was removed in 2.4). This means:

  • if you’re using Logstash 2.4 or later and the data source is Logsene or a local Elasticsearch cluster version 2.0 or later, the steps above will work
  • if you’re using Logstash up to 2.3.x with Logsene or a local Elasticsearch cluster version 2.0 or later, you need to either set scan => false (see the sketch after this list) or upgrade Logstash
  • if the data source is Elasticsearch 1.x, you need Logstash 2.3.x (with scan => true) in order for scrolling to be efficient
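For instance, reading from Logsene with Logstash 2.3.x and scanning disabled would look like this (a sketch; the scan option disappeared from the elasticsearch input in Logstash 2.4):

input {
  elasticsearch {
    hosts => ["logsene-receiver.sematext.com:80"]
    index => "SOURCE_LOGSENE_APP_TOKEN"
    scan => false   # search_type=scan is deprecated on Elasticsearch 2.0 and later
  }
}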
