Graylog 2 Plugin Howto

Given some knowledge of the Java language and ecosystem, it's not that hard to write your own plugin for the open source logging platform "Graylog".
There is good documentation available at docs.graylog.org/en/2.4/pages/plugins.html, although some of the information and example Git repos currently seem a bit out of date, probably because Graylog 3 is around the corner.

Preparations

Besides Git and Maven, you should install the free IntelliJ IDEA CE – I tried Eclipse and Visual Studio Code first, but couldn't get everything working there.

You will also need Elasticsearch and MongoDB running locally – the easiest way is usually Docker: just create a docker-compose.yml file and fire up the containers, e.g.:

version: '3.1'

services:
  mongo:
    image: mongo
    restart: always
    ports:
      - 27017:27017

  mongo-express:
    image: mongo-express
    restart: always
    ports:
      - 8081:8081

  elasticsearch:
    image: elasticsearch:5.6
    restart: always
    ports:
      - 9200:9200
      - 9300:9300

and run "docker-compose up -d" to launch.
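To verify that everything came up, a quick check like the following can help (a sketch – the ports are the ones mapped in the compose file above, and curl only works for the HTTP-speaking services):

```shell
#!/bin/bash
# quick sanity check for the HTTP ports mapped in docker-compose.yml
check_port() {
  if curl -fsS "http://localhost:$1" >/dev/null 2>&1; then
    echo "service on port $1 is up"
  else
    echo "service on port $1 is not reachable (yet)"
  fi
}
check_port 9200  # Elasticsearch REST API
check_port 8081  # mongo-express web UI
```

Containers can take a few seconds to become ready after "docker-compose up -d", so re-run the check if a service is reported as not reachable.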

Once those tools are set up, you are going to need two different repositories to start:
First, install the Graylog Project CLI from github.com/Graylog2/graylog-project-cli
After that, follow the README at
github.com/Graylog2/graylog-project
and bootstrap a new Graylog project with:

graylog-project bootstrap github://Graylog2/graylog-project.git

This will bootstrap the latest "master" version of Graylog, which currently is 3.0.0-alpha.5. If you want a different version, e.g. the latest stable one, add the "manifest" parameter (see here), e.g.:

--manifest="manifests/2.5.json"

Now comes the first caveat – there is no "scripts" dir inside "graylog-project" anymore (see github.com/Graylog2/documentation/issues/473), which you'll need later on to create some boilerplate code for your new plugin. I assume this is going to be fixed / replaced once Graylog 3 is final. So currently you need to get the "scripts" folder from an older branch at github.com/Graylog2/graylog-project/tree/pre-graylog-project-cli
You can simply download the branch as a zip and extract the "scripts" folder into the "graylog-project" dir created in the previous step.

UPDATE: you can get along without the "scripts" dir to generate a new plugin (see e.g. here) by simply using Maven directly. Open a terminal inside the "graylog-project-repos" folder and enter:

mvn archetype:generate -DarchetypeGroupId=org.graylog -DarchetypeArtifactId=graylog-plugin-archetype

and enter the required plugin info. (You will still need to fix the plugin path in the main project POM, as well as the parent project version and the yarn.version variable in the plugin POM, see below.)

Continue following the README: import "graylog-project" as a new IntelliJ project and create a new "Run" configuration as explained.

After that, open the IntelliJ terminal window (or an external console if you prefer), change to the "graylog-project" folder and run "mvn clean install" to build and store all required project artifacts – otherwise you won't be able to run your plugin's Maven targets later because of missing dependency jar files, e.g. graylog-server.jar.

Finally, create a "graylog.conf" file as explained here. Remember that you have to fill in at least two variables: password_secret and root_password_sha2. See the comments in "graylog.conf" for how to set those.
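The usual approach (hedged here, since your tooling may differ – the "graylog.conf" comments suggest e.g. "pwgen -N 1 -s 96") is to generate a random string for password_secret and a SHA-256 hash of your desired admin password for root_password_sha2:

```shell
# random secret for password_secret (openssl as an alternative to pwgen)
openssl rand -hex 48

# SHA-256 hash of the admin password for root_password_sha2
# (on macOS, use "shasum -a 256" instead of sha256sum)
echo -n "yourpassword" | sha256sum | cut -d " " -f1
```

Paste the two resulting values into "graylog.conf" before starting the server.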

Creating your own plugin

If you have "installed" the "scripts" folder and didn't use Maven to generate the plugin (see above), open "scripts/includes" now and change "modulesPrefix" to:

modulesPrefix=../graylog-project-repos

so that the "bootstrap-plugin" command can find its dependencies. To create a new plugin skeleton, stay in the "graylog-project" folder and run:

scripts/bootstrap-plugin your-plugin-name

and enter the info requested, e.g.:


You can ignore the build and "file not found" errors at the end for now – the script still has some path problems and there is no file "build.config.js.sample" – but it should have generated your new plugin folder and files.
Now move the newly created "graylog-plugin-your-plugin-name" folder one level up into the "graylog-project-repos" folder (if you used Maven to generate the plugin, it should already be there, see above!).

Now, for both cases (Maven or "scripts"), adjust the "graylog-project/pom.xml" file from

<module>graylog-plugin-your-plugin-name</module>

to

<module>../graylog-project-repos/graylog-plugin-your-plugin-name</module>

(or add the entry if it's not there) and you should see it in your IntelliJ project folders. Also, in "graylog-project-repos/graylog-plugin-your-plugin-name/pom.xml", fix the version for "graylog-plugin-web-parent" –
for me it was "3.0.0-alpha.3-SNAPSHOT" although I was on alpha 5 (check the version you are working with and change accordingly):

<parent>
    <groupId>org.graylog.plugins</groupId>
    <artifactId>graylog-plugin-web-parent</artifactId>
    <version>3.0.0-alpha.5-SNAPSHOT</version>
    <relativePath>../graylog2-server/graylog-plugin-parent/graylog-plugin-web-parent</relativePath>
</parent>

And, to finish up, change the following line in that same file, otherwise IntelliJ gives you the error "Element yarn.version is not allowed here" – probably a typo in the boilerplate code.

<!-- change this: -->
<!-- <yarn.version>${yarn.version}</yarn.version> -->
<!-- to this: -->
<yarnVersion>${yarn.version}</yarnVersion>

Now everything should be green in your IDE and you can, e.g., run "mvn package" for your plugin, or do the same inside IntelliJ: in the "Maven" tab, first hit the double-arrows icon at the top to "Reimport All Maven Projects", then expand your new plugin, expand "Lifecycle" and right-click "package" to select "Run Maven Build":

If all goes well, you should get the message "Process finished with exit code 0" and find a newly created *.jar file in the "target" directory of your plugin folder. All that is left to do is copy it over to the "graylog-project-repos/graylog2-server/plugins" folder (create it if it is not there yet) – this location is defined in "graylog-project-repos/graylog2-server/graylog.conf".
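That copy step can be scripted along these lines (a sketch assuming the folder layout from this walkthrough, run from inside "graylog-project-repos", with the example plugin name):

```shell
# copy the freshly built plugin jar into the server's plugin directory
PLUGIN=graylog-plugin-your-plugin-name
mkdir -p graylog2-server/plugins
cp "$PLUGIN"/target/*.jar graylog2-server/plugins/ 2>/dev/null \
  || echo "no jar found - run 'mvn package' inside $PLUGIN first"
```

Re-run it after every "mvn package" so the server always picks up the latest build on restart.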

After that, start your new "Run" configuration in IntelliJ to fire up the Graylog server and watch the logs to see if your plugin shows up. For the web interface, change to "graylog2-server/graylog2-web-interface" and run

./node/yarn/dist/bin/yarn start

See also github.com/Graylog2/graylog-project#web-interface-start

That should get you started – at least this is how I got my "Json Remote Polling" input plugin working. For an overview of plugin types, check out docs.graylog.org/en/2.4/pages/plugins.html#plugin-types. For details on what's inside a Graylog plugin, see docs.graylog.org/en/2.4/pages/plugins.html#the-anatomy-of-a-plugin. Happy logging! 🙂

PHP MySQL GELF and JSON Polling Graylog plugins available

A recent project required getting MySQL status information into a Graylog logging system. However, the company firewall only allowed the webserver to communicate with the outside world over the standard ports 80 and 443, so it could not, for example, push data to Graylog itself via UDP.
This led to two small Graylog plugins: one makes the data available over the internet (using PHP), and the other fetches that data from the Graylog server via "polling" – the job of a so-called Graylog input plugin (Java). Both plugins are available as free downloads in the Graylog Marketplace.

How to call methods of a VueJS app from outside

While working on my VueJS product configurator, I faced the challenge of calling functions in a VueJS app from "outside" – meaning the VueJS app is integrated into a PHP shopping application like Oxid eShop, Shopware or Magento, for example, and I have to check, say, the state of the current configuration inside the VueJS app.

In particular, I had to check whether the configuration is finished when the "to basket" button on the shop's detail page is clicked and, of course, get the configured product back from the app to add it to the basket.

One way I've found to achieve this is to save the VueJS instance into a global window variable like this:

window.confApp = new Vue({
    el: '#configurator',
    render(h) {
        return h(App, {
            props: {
                apiUrl: this.$el.attributes.apiUrl.value
            }
        })
    }
})


This way, you can access it from anywhere on the page, e.g. by adding a script to a Smarty template of the shop:

$(function() {
    $('#toBasket').on('click', function(e) {
        console.log(window.confApp);
        return true;
    });
});


Now, to find the Vue component inside the Vue object, you can inspect the object in the browser console; it is probably nested in the "$children" array of the Vue instance:

You can see the component's functions in the screenshot; the one we need is "exportConfiguration", which can now be called like this:

$(function() {
    $('#toBasket').on('click', function(e) {
        // get the global configurator Vue instance
        var configurator = window.confApp.$children[0];
        var configuration = configurator.exportConfiguration();
        if (!configuration) {
            return false;
        }
        return true;
    });
});


Creating Shopware 5.3 plugins with the Shopware CLI Tools

The Shopware CLI Tools speed up plugin creation considerably – whether you are building frontend, backend or API plugins (or any combination thereof). Below is an example of such a "multi-feature" plugin.

Installing the Shopware CLI Tools

We will use the CLI Tools to create a plugin with the imaginative name "MyTest01". The following is required:

  1. The Shopware CLI Tools are not part of the standard Shopware distribution and have to be installed first; downloading a "Phar" release is the recommended way.
  2. You can copy the Phar into "bin/" of the Shopware root directory, for example, and rename it to "sw" for simplicity.

Using the Shopware CLI Tools

After that, the tools are ready to use. Here is a call that generates a plugin with frontend, backend and API components:

./bin/sw plugin:create --haveApi --haveBackend --haveModels --haveFrontend --haveCommands --backendModel=MyTest01\\Models\\MyTest01Model MyTest01

Problem with interactive backend model generation

If you omit the "--backendModel" parameter, you can (or rather must) specify the model interactively on the CLI:

In both cases, it is important to specify the complete path / namespace! However, I had no success with the interactive method: no valid file names were generated; instead, the complete CLI command including its parameters was used as the name, which leads to invalid file and class names like this:

class ‚plugin:create‘ –haveApi –haveBackend –haveModels –haveFrontend –haveCommands MyTest02 extends ShopwareCommand

This may be due to missing compatibility with Shopware 5.3 (I haven't tested older versions so far), or a bug has recently crept into the parameter handling.

If you pass the "backendModel" as a parameter, however, the generation works:

After generating, you should refresh the plugin list (and clear the cache if necessary) with:

./bin/console sw:plugin:refresh
./bin/console sw:cache:clear

Problems with the API generation

However, there are a few more small problems with the generated code, at least in the current version 5.3 and if you passed the "--haveApi" parameter during generation.

The first problem is a fatal error that looks roughly like this:

PHP Fatal error: Uncaught TypeError:
Argument 1 passed to MyTest01\\Subscriber\\ApiSubscriber::__construct() must be an instance of
Shopware\\Recovery\\Common\\DependencyInjection\\ContainerInterface, instance of
ShopwareProduction4a3b86fd9a3e955627adbda985118ae3e2bdc589ProjectContainer given

If you look at the generated classes and compare them with similar classes in the Shopware source code itself, you will eventually find the problem: the generated file "MyTest01\Subscriber\ApiSubscriber.php" uses the wrong interface, so you have to change the following:

//use Shopware\Recovery\Common\DependencyInjection\ContainerInterface;
use Symfony\Component\DependencyInjection\ContainerInterface;

In addition, if not already present, the "service_container" must be passed to the subscriber as an argument in the "services.xml" file, e.g.:

<service id="my_test01.subscriber.api_subscriber" class="MyTest01\Subscriber\ApiSubscriber">
    <argument id="service_container" type="service"/>
    <tag name="shopware.event_subscriber"/>
</service>


The last problem finally occurs when you try to call the API URL "/api/mytest01model" (after generating an API key in the backend, which you can then use as the password for the login) – another "Fatal Error" along the lines of:

PHP Fatal error: Uncaught Error: Class ‚Shopware\\Components\\Api\\Resource\\MyTest01Model‘ not found in
/var/www/html/engine/Shopware/Components/Api/Manager.php:55\nStack trace:\n#0
/var/www/html/custom/plugins/MyTest01/Controllers/Api/MyTest01Model.php(13):
Shopware\\Components\\Api\\Manager::getResource(‚MyTest01Model‘)\n#1 /var/www/html/engine/Library/Enlight/Class.php(74):
Shopware_Controllers_Api_MyTest01Model->init()\n#2 /var/www/html/engine/Library/Enlight/Controller/Action.php(101):

Two things need to be adjusted here:

In "MyTest01\Components\Api\Resource\MyTest01Model.php", change the namespace to:

//namespace Shopware\Components\Api\Resource;
namespace MyTest01\Components\Api\Resource;

and define the new service in "Resources\services.xml":

<service id="shopware.api.mytest01model" class="MyTest01\Components\Api\Resource\MyTest01Model">
</service>

By the way, if a faulty module also breaks the backend, you can deactivate the affected module from the command line with:

./bin/console sw:plugin:uninstall MyTest01

Our test module, however, should now work – and not only be callable via the API, but also show up in the backend:

Happy coding! 🙂

Use J. Wilder's nginx-proxy in multiple docker-compose projects

There is an awesome project for Docker if you want to run e.g. multiple webserver containers on the same ports on one host machine, say Apache or Nginx on port 80: jwilder/nginx-proxy.

Nginx-Proxy for Docker

You have to expose port 80 in the Dockerfile as usual, but you don't explicitly map the port in your docker-compose.yml or when using "docker run …". Instead, you let the nginx-proxy do the heavy lifting and forward the requests to the right container. For this, you add an environment variable for the proxy:

environment:
 VIRTUAL_HOST: myapp.dev.local

so that it knows which request to forward to which container.

If you want to start multiple docker-compose.yml files, you can’t just add the nginx-proxy container to all the docker-compose.yml files though. If you only had one docker-compose project with e.g. multiple webservers on port 80, you could just add one proxy container to your YAML:

nginx-proxy:
    image: jwilder/nginx-proxy
    container_name: nginx-proxy
    ports:
      - "80:80"
    volumes:
      - /var/run/docker.sock:/tmp/docker.sock:ro

The Problem

But if you have multiple projects, this approach causes conflicts, since there can only be one container with any given name – and you only want one nginx-proxy across projects after all! Unfortunately, docker-compose does not (yet?) allow reusing an existing container and throws an error if you try to start the same container multiple times.

If you want to share the proxy container for different projects, you should also use an external network in your docker-compose.yml files like so (see github.com/jwilder/nginx-proxy/issues/552):

networks:
  default:
    external:
      name: nginx-proxy

Be aware that if you do this, you have to create the network manually before you run "docker-compose up -d":

docker network create nginx-proxy

The Solution

A solution for using the proxy across projects is to check for the network and the nginx-proxy container before each call to "docker-compose up -d". One way to do this is with a Makefile: e.g. in your "make start" or "make up" targets, you could call a shell script which does those checks for you:

start:
 ./config/run-proxy.sh
 docker-compose start

up:
 ./config/run-proxy.sh
 docker-compose up -d

This way, the script would create the required network and/or the proxy container if either of them doesn’t exist yet. So all the running projects / containers can share the global proxy container in the global network.

The Details

So, here is an example docker-compose.yml and also an example bash script (run-proxy.sh):

#!/bin/bash
##########################################################################
# script to check if the jwilder proxy container is already running
# and if the nginx-proxy network exists
# should be called before "docker-compose up -d"
##########################################################################

if [ ! "$(docker network ls | grep nginx-proxy)" ]; then
  echo "Creating nginx-proxy network ..."
  docker network create nginx-proxy
else
  echo "nginx-proxy network exists."
fi

if [ ! "$(docker ps | grep nginx-proxy)" ]; then
    if [ "$(docker ps -aq -f name=nginx-proxy)" ]; then
        # cleanup
        echo "Cleaning Nginx Proxy ..."
        docker rm nginx-proxy
    fi
    # run your container in our global network shared by different projects
    echo "Running Nginx Proxy in global nginx-proxy network ..."
    docker run -d --name nginx-proxy -p 80:80 --network=nginx-proxy -v /var/run/docker.sock:/tmp/docker.sock:ro jwilder/nginx-proxy
else
  echo "Nginx Proxy already running."
fi

And, for reference – an example docker-compose.yml:

version: '2'
services:

  shopware:
    image: docker.myregistry.de/docker/php7-apache/image
    container_name: appswdemo
    environment:
     VIRTUAL_HOST: shopware.dev.local
     VIRTUAL_PORT: 80
     DB_HOST: db
     SHOPWARE_VERSION: 5.3
    volumes:
     - ./config/config.php:/var/www/html/config.php
     - ./src/pluginslocal:/var/www/html/engine/Shopware/Plugins/Local
     - ./src/plugins:/var/www/html/custom/plugins
     - ./src/customtheme:/var/www/html/themes/customtheme
    links:
    - db

  # data only container for persistence
  dbdata:
    container_name: dbdataswdemo
    image: mysql:5.6
    entrypoint: /bin/bash
  db:
    image: mysql:5.6
    container_name: dbswdemo
    environment:
        MYSQL_ROOT_PASSWORD: root
        MYSQL_DATABASE: shopware
        MYSQL_USER: shopware
        MYSQL_PASSWORD: shopware
        TERM: xterm
    volumes_from:
      - dbdata

  phpmyadmin:
    image: phpmyadmin/phpmyadmin
    environment:
      VIRTUAL_HOST: shopwaredb.dev.local
      VIRTUAL_PORT: 8080
      PMA_ARBITRARY: 1
      MYSQL_USER: shopware
      MYSQL_PASSWORD: shopware
      MYSQL_ROOT_PASSWORD: root
    links:
      - "db:db"

networks:
  default:
    external:
      name: nginx-proxy

As you can see, the web container ("shopware" in this example), which runs Apache and PHP 7 in this case, doesn't map any explicit ports – it only tells the proxy its URL and "virtual port", but there is no "ports:" section in the YML file. The same goes for the "phpmyadmin" container.

And finally, the relevant parts of the Makefile:

ARGS = $(filter-out $@,$(MAKECMDGOALS))
MAKEFLAGS += --silent
start:
 ./run-proxy.sh
 docker-compose start
up:
 ./run-proxy.sh
 docker-compose up -d
#############################
# Argument fix workaround
#############################
%:
 @:

nginx-proxy will now forward all requests for shopware.dev.local to the PHP / Apache container on port 80 and all requests for shopwaredb.dev.local to the phpMyAdmin container on port 8080, and you could start more containers on ports 80 and 8080 without any port conflicts on your host!

Tools and tips from Developercamp 2017

After the camp is before the camp – first the DevOps Camp last weekend, now the Developer Camp as part of the Webweek Nürnberg.

The DevCamp, too, naturally offered numerous interesting sessions with many tools and tips for everyday development work. Here is a selective choice from the sessions I attended so far – have fun! 🙂

Webassembly

React Native

Big Data

Ramda.js / Functional programming

NPM

From monolith to microservices

REST-Backend

Ansible

Documentation

Deep Learning

Miscellaneous