Merge pull request #399 from pwnbus/master

Update MozDef to use ES v2
A Smith 2017-06-27 15:30:21 -05:00 committed by GitHub
Parents eca3ffd4bc 6c95e30b8d
Commit ed2f6e02c2
363 changed files: 138893 additions and 9894 deletions
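The diff below converts MozDef's alerts from the old pyes filter API to the new query_models classes. A minimal sketch of the resulting alert shape (the class name, matched values, and thresholds are illustrative; the API names are taken from the files in this diff):

#!/usr/bin/env python
# Sketch only: mirrors the post-migration alert style used throughout this PR.
# SearchQuery, TermMatch, PhraseMatch and the AlertTask helpers appear in the diff;
# the class name, field values and thresholds here are illustrative.
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, PhraseMatch

class AlertExample(AlertTask):
    def main(self):
        # build the ES v2 query with query_models instead of pyes filters
        search_query = SearchQuery(minutes=15)
        search_query.add_must([
            TermMatch('_type', 'event'),
            PhraseMatch('summary', 'failed'),
        ])
        self.filtersManual(search_query)
        # aggregate on a field, keeping a limited number of sample events
        self.searchEventsAggregated('details.sourceipaddress', samplesLimit=10)
        # fire once an aggregation bucket reaches the threshold
        self.walkAggregations(threshold=10)

    def onAggregation(self, aggreg):
        # aggreg['count'], aggreg['value'] and aggreg['events'] are supplied by AlertTask
        summary = '{0} failed events by {1}'.format(aggreg['count'], aggreg['value'])
        return self.createAlertDict(summary, 'bruteforce', ['ssh'], aggreg['events'], 'NOTICE')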

44
.github/ISSUE_TEMPLATE.md vendored Normal file
View file

@ -0,0 +1,44 @@
<!---
Verify first that your issue/request is not already reported on GitHub.
Also test if the latest release, and master branch are affected too.
-->
#### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Bug Report
- Feature Idea
- Documentation Report
#### COMPONENT NAME
<!--- Name of the cron/worker/module/plugin/task/feature -->
#### CONFIGURATION
<!---
Mention any settings you have changed/added/removed
-->
#### OS / ENVIRONMENT
<!---
Mention the OS you are running MozDef from and versions of all MozDef components you are running
-->
#### DESCRIPTION
<!--- Explain the problem briefly -->
#### STEPS TO REPRODUCE
<!---
For bugs, show exactly how to reproduce the problem, using a minimal test-case.
For new features, show how the feature would be used.
-->
<!--- Paste any log messages/configurations/scripts below that are applicable -->
<!--- You can also paste gist.github.com links for larger files -->
#### EXPECTED RESULTS
<!--- What did you expect to happen when running the steps above? -->
#### ACTUAL RESULTS
<!--- What actually happened? -->

7
.gitignore vendored
View file

@ -3,3 +3,10 @@ meteor/packages
*.pyc
celebeat-schedule.*
results
*.log
.coverage
.cache
*.pyc
cron/ipblocklist.txt
alerts/generic_alerts
/.project

6
.gitmodules vendored
View file

@ -1,6 +0,0 @@
[submodule "lib/mozdef_client"]
path = lib/mozdef_client
url = https://github.com/gdestuynder/mozdef_client
[submodule "bot/modules/bugzilla"]
path = bot/modules/bugzilla
url = https://github.com/gdestuynder/simple_bugzilla

View file

@ -2,28 +2,28 @@ Contributing to MozDef
======================
MozDef is an open source project and we love to receive contributions from the community. There are many ways to contribute, from writing documentation, submitting bug reports and feature requests or writing code which can be incorporated into MozDef.
We would also love to hear how you are using MozDef and to receive contributions to make it easier to deploy and integrate.
We would also love to hear how you are using MozDef and to receive contributions that make it easier to deploy and integrate.
Bug reports
-----------
If you think you have found a bug in MozDef, first make sure that you are testing against the [master branch](https://github.com/mozilla/MozDef) - your issue may already have been fixed. If not, search our [issues list](https://github.com/mozilla/MozDef/issues) on GitHub in case a similar issue has already been opened.
If you think you have found a bug in MozDef, first make sure that you are testing against the [master branch](https://github.com/mozilla/MozDef) - your issue may already have been fixed. If not, search our [issues list](https://github.com/mozilla/MozDef/issues) on GitHub in the event a similar issue has already been opened.
It is very helpful if you can prepare a reproduction of the bug. In other words, provide a small test case which we can run to confirm your bug. It makes it easier to find the problem and to fix it.
It is very helpful if you can provide enough information to replicate the bug. In other words, provide a small test case which we can run to confirm your bug. It makes it easier to find the problem and resolve it.
Provide as much information as you can. The easier it is for us to recreate your problem, the faster it is likely to be fixed.
Provide as much information as you can. The easier it is for us to recreate your problem, the faster we can fix it.
Feature requests
----------------
If you are looking for a feature that doesn't exist currently in MozDef, you are probably not alone.
Open an issue on our [issues list](https://github.com/mozilla/MozDef/issues) on GitHub which describes the feature you would like to see, why you need it, and how it should work.
Open an issue on our [issues list](https://github.com/mozilla/MozDef/issues) on GitHub which describes the feature you would like to see, the value it provides, and how it should work.
If you attach diagrams or mockups, it would be super nice ;-).
Contributing code and documentation changes
-------------------------------------------
If you have a bugfix or new feature that you would like to contribute to MozDef, please find or open an issue about it first. Talk about what you would like to do. It may be that somebody is already working on it, or that there are particular issues that you should know about before implementing the change.
If you have a bugfix or new feature that you would like to contribute to MozDef, please search through our issues and see if one exists, or open an issue about it first. Explain what you would like to do. It is possible someone has already begun to work on it, or that there are existing issues that you should know about before implementing the change.
We enjoy working with contributors to get their code accepted. There are many approaches to fixing a problem and it is important to find the best approach before writing too much code.
@ -36,4 +36,4 @@ You will need to fork the main [MozDef repository](https://github.com/mozilla/Mo
Push your local changes to your forked copy of the repository and [submit a pull request](https://help.github.com/articles/using-pull-requests). In the pull request, describe what your changes do and mention the number of the issue where discussion has taken place, eg "Closes #123".
Then sit back and wait. There will probably be discussion about the pull request and, if any changes are needed, we would love to work with you to get your pull request merged into MozDef.
Then sit back and wait. There will probably be discussion about the pull request, and if any changes are needed, we would love to work with you to get your pull request merged into MozDef.

16
alerts/alertPlugins.ini Normal file
View file

@ -0,0 +1,16 @@
[uwsgi]
chdir = /opt/mozdef/envs/mozdef/alerts/
uid = mozdef
mule = alertWorker.py
pyargv = -c /opt/mozdef/envs/mozdef/alerts/alertWorker.conf
log-syslog = alertplugins-worker
log-drain = generated 0 bytes
socket = /opt/mozdef/envs/mozdef/alerts/alertPlugins.socket
virtualenv = /opt/mozdef/envs/mozdef/
master-fifo = /opt/mozdef/envs/mozdef/alerts/alertPlugins.fifo
procname-master = [m]
procname-prefix = [alertPlugins]
never-swap
pidfile= /var/run/mozdef-alerts/alertPlugins.pid
vacuum = true
enable-threads

View file

@ -1,16 +0,0 @@
[uwsgi]
chdir = /home/mozdef/envs/mozdef/alerts/
uid = mozdef
mule = alertWorker.py
pyargv = -c /home/mozdef/envs/mozdef/alerts/alertWorker.conf
daemonize = /home/mozdef/envs/mozdef/logs/uwsgi.alertPluginsmules.log
;ignore normal operations that generate nothing but normal response
log-drain = generated 0 bytes
log-date = %%a %%b %%d %%H:%%M:%%S
socket = /home/mozdef/envs/mozdef/alerts/alertPluginsmules.socket
virtualenv = /home/mozdef/envs/mozdef/
master-fifo = /home/mozdef/envs/mozdef/alerts/alertPluginsmules.fifo
never-swap
pidfile= /home/mozdef/envs/mozdef/alerts/alertPluginsmules.pid
vacuum = true
enable-threads

4
alerts/alertWorker.conf Normal file
View file

@ -0,0 +1,4 @@
[options]
mqalertserver=localhost
mquser=mozdef
mqpassword=mozdef

View file

@ -7,6 +7,7 @@
#
# Contributors:
# Jeff Bryner jbryner@mozilla.com
# Brandon Myers bmyers@mozilla.com
#
# Alert Worker to listen for alerts and call python plugins
# for user-controlled reaction to alerts.
@ -24,6 +25,7 @@ from logging.handlers import SysLogHandler
from operator import itemgetter
logger = logging.getLogger()
logger.level = logging.INFO
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
@ -83,7 +85,7 @@ def dict2List(inObj):
yield l
elif isinstance(v, dict):
for l in dict2List(v):
yield l
yield l
else:
yield v
else:
@ -114,6 +116,8 @@ def sendEventToPlugins(anevent, pluginList):
sys.stderr.write('TypeError on set intersection for dict {0}'.format(anevent))
return anevent
if send:
message_log_str = '{0} received message: ({1}) {2}'.format(plugin[0].__module__, anevent['utctimestamp'], anevent['summary'])
logger.info(message_log_str)
anevent = plugin[0].onMessage(anevent)
return anevent
@ -231,7 +235,7 @@ def initConfig():
options.mqpassword = getConfig('mqpassword', 'guest', options.configfile)
options.mqport = getConfig('mqport', 5672, options.configfile)
# mqack=True sets persistant delivery, False sets transient delivery
options.mqack = getConfig('mqack', True, options.configfile)
options.mqack = getConfig('mqack', True, options.configfile)
if __name__ == '__main__':

50
alerts/amoFailedLogins.py Normal file
View file

@ -0,0 +1,50 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, ExistsMatch, PhraseMatch
class AlertFailedAMOLogin(AlertTask):
def main(self):
search_query = SearchQuery(minutes=10)
search_query.add_must([
TermMatch('_type', 'addons'),
TermMatch('details.signatureid', 'authfail'),
ExistsMatch('details.sourceipaddress'),
PhraseMatch('details.msg', "The password was incorrect"),
ExistsMatch('details.suser')
])
self.filtersManual(search_query)
# Search aggregations, keep X samples of events at most
self.searchEventsAggregated('details.suser', samplesLimit=15)
# alert when >= X matching events in an aggregation
self.walkAggregations(threshold=20)
# Set alert properties
def onAggregation(self, aggreg):
# aggreg['count']: number of items in the aggregation, ex: number of failed login attempts
# aggreg['value']: value of the aggregation field, ex: toto@example.com
# aggreg['events']: list of events in the aggregation
category = 'addons'
tags = ['addons']
severity = 'NOTICE'
summary = ('{0} amo failed logins: {1}'.format(aggreg['count'], aggreg['value']))
# append most common ips
ips = self.mostCommon(aggreg['allevents'],'_source.details.sourceipaddress')
for i in ips[:5]:
summary += ' {0} ({1} hits)'.format(i[0], i[1])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

56
alerts/auditd_sftp.py Normal file
View file

@ -0,0 +1,56 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
# Aaron Meihm ameihm@mozilla.com
# Michal Purzynski <mpurzynski@mozilla.com>
# Alicia Smith <asmith@mozilla.com>
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, PhraseMatch
class AlertSFTPEvent(AlertTask):
def main(self):
search_query = SearchQuery(minutes=5)
search_query.add_must([
TermMatch('_type', 'auditd'),
TermMatch('category', 'execve'),
TermMatch('processname', 'audisp-json'),
TermMatch('details.processname', 'ssh'),
PhraseMatch('details.parentprocess', 'sftp'),
])
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'execve'
severity = 'NOTICE'
tags = ['audisp-json, audit']
srchost = 'unknown'
username = 'unknown'
directory = 'unknown'
x = event['_source']
if 'details' in x:
if 'hostname' in x['details']:
srchost = x['details']['hostname']
if 'originaluser' in x['details']:
username = x['details']['originaluser']
if 'cwd' in x['details']:
directory = x['details']['cwd']
summary = 'SFTP Event by {0} from host {1} in directory {2}'.format(username, srchost, directory)
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity)

View file

@ -1,42 +0,0 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
from lib.alerttask import AlertTask
class AlertBroIntel(AlertTask):
def main(self):
# look for events in last 30 mins
date_timedelta = dict(minutes=30)
# Configure filters by importing a kibana dashboard
self.filtersFromKibanaDash('bro_intel_dashboard.json', date_timedelta)
# Search aggregations on field 'seenindicator', keep 50 samples of events at most
self.searchEventsAggregated('seenindicator', samplesLimit=50)
# alert when >= 5 matching events in an aggregation
self.walkAggregations(threshold=5)
# Set alert properties
def onAggregation(self, aggreg):
# aggreg['count']: number of items in the aggregation, ex: number of failed login attempts
# aggreg['value']: value of the aggregation field, ex: toto@example.com
# aggreg['events']: list of events in the aggregation
category = 'bro'
tags = ['bro']
severity = 'NOTICE'
summary = ('{0} {1}: {2}'.format(aggreg['count'], 'bro intel match', aggreg['value']))
# append first 3 source IPs
summary += ' sample sourceips: '
for e in aggreg['events'][:3]:
if 'sourceipaddress' in e['_source']['details'].keys():
summary += '{0} '.format(e['_source']['details']['sourceipaddress'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

View file

@ -1,331 +0,0 @@
{
"title": "",
"services": {
"query": {
"idQueue": [
1
],
"list": {
"0": {
"query": "",
"alias": "",
"color": "#7EB26D",
"id": 0,
"pin": false,
"type": "lucene",
"enable": true
}
},
"ids": [
0
]
},
"filter": {
"idQueue": [
1
],
"list": {
"0": {
"type": "time",
"field": "utctimestamp",
"from": "now-1h",
"to": "now",
"mandate": "must",
"active": true,
"alias": "",
"id": 0
},
"1": {
"type": "querystring",
"field": "category",
"query": "brointel",
"mandate": "must",
"active": true,
"alias": "",
"id": 1
},
"2": {
"type": "querystring",
"query": "_type:event",
"mandate": "must",
"active": true,
"alias": "",
"id": 2
},
"3": {
"type": "querystring",
"query": "_exists_:details.seenindicator",
"mandate": "must",
"active": true,
"alias": "",
"id": 3
}
},
"ids": [
0,
1,
2,
3
]
}
},
"rows": [
{
"title": "Graph",
"height": "350px",
"editable": true,
"collapse": true,
"collapsable": true,
"panels": [
{
"span": 8,
"editable": true,
"group": [
"default"
],
"type": "histogram",
"mode": "count",
"time_field": "utctimestamp",
"value_field": null,
"auto_int": true,
"resolution": 100,
"interval": "1s",
"fill": 3,
"linewidth": 3,
"timezone": "browser",
"spyable": true,
"zoomlinks": true,
"bars": false,
"stack": true,
"points": false,
"lines": true,
"legend": true,
"x-axis": true,
"y-axis": true,
"percentage": false,
"interactive": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"title": "Events over time",
"intervals": [
"auto",
"1s",
"1m",
"5m",
"10m",
"30m",
"1h",
"3h",
"12h",
"1d",
"1w",
"1M",
"1y"
],
"options": true,
"tooltip": {
"value_type": "cumulative",
"query_as_alias": true
},
"annotate": {
"enable": false,
"query": "*",
"size": 20,
"field": "_type",
"sort": [
"_score",
"desc"
]
},
"pointradius": 5,
"show_query": true,
"legend_counts": true,
"zerofill": true,
"derivative": false,
"scale": 1,
"y_as_bytes": false,
"grid": {
"max": null,
"min": 0
},
"y_format": "none"
},
{
"error": false,
"span": 4,
"editable": true,
"type": "terms",
"loadingEditor": false,
"field": "_type",
"exclude": [],
"missing": false,
"other": true,
"size": 10,
"order": "count",
"style": {
"font-size": "10pt"
},
"donut": false,
"tilt": false,
"labels": true,
"arrangement": "horizontal",
"chart": "bar",
"counter_pos": "below",
"spyable": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"tmode": "terms",
"tstat": "total",
"valuefield": "",
"title": "Category"
}
],
"notice": false
},
{
"title": "Events",
"height": "350px",
"editable": true,
"collapse": false,
"collapsable": true,
"panels": [
{
"title": "All events",
"error": false,
"span": 12,
"editable": true,
"group": [
"default"
],
"type": "table",
"size": 100,
"pages": 5,
"offset": 0,
"sort": [
"utctimestamp",
"desc"
],
"style": {
"font-size": "9pt"
},
"overflow": "min-height",
"fields": [
"utctimestamp",
"summary",
"details.cluster_client_ip",
"details.destinationipaddress",
"details.rx_hosts",
"details.sourceipaddress"
],
"highlight": [],
"sortable": true,
"header": true,
"paging": true,
"spyable": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"field_list": true,
"status": "Stable",
"trimFactor": 500,
"normTimes": true,
"all_fields": false,
"localTime": false,
"timeField": "@timestamp"
}
],
"notice": false
}
],
"editable": true,
"failover": true,
"index": {
"interval": "day",
"pattern": "[events-]YYYYMMDD",
"default": "events",
"warm_fields": true
},
"style": "dark",
"panel_hints": true,
"pulldowns": [
{
"type": "query",
"collapse": false,
"notice": false,
"query": "*",
"pinned": true,
"history": [
"",
"bro intel",
"\"bro intel\"",
"*",
"jbryner"
],
"remember": 10,
"enable": true
},
{
"type": "filtering",
"collapse": false,
"notice": true,
"enable": true
}
],
"nav": [
{
"type": "timepicker",
"collapse": false,
"notice": false,
"status": "Stable",
"time_options": [
"5m",
"15m",
"30m",
"1h",
"2h",
"3h",
"6h",
"12h",
"24h"
],
"refresh_intervals": [
"5s",
"10s",
"30s",
"1m",
"5m",
"15m"
],
"timefield": "utctimestamp",
"now": true,
"filter_id": 0,
"enable": true
}
],
"loader": {
"save_gist": false,
"save_elasticsearch": true,
"save_local": true,
"save_default": true,
"save_temp": true,
"save_temp_ttl_enable": true,
"save_temp_ttl": "30d",
"load_gist": true,
"load_elasticsearch": true,
"load_elasticsearch_size": 20,
"load_local": true,
"hide": false
},
"refresh": false
}

View file

@ -1,48 +0,0 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
from lib.alerttask import AlertTask
import pyes
class AlertBroIntel(AlertTask):
def main(self):
# look for events in last 30 mins
date_timedelta = dict(minutes=30)
# Configure filters using pyes
must = [
pyes.TermFilter('_type', 'event'),
pyes.TermFilter('category', 'brointel'),
pyes.ExistsFilter('seenindicator')
]
self.filtersManual(date_timedelta, must=must)
# Search aggregations on field 'seenindicator', keep 50 samples of events at most
self.searchEventsAggregated('details.seenindicator', samplesLimit=50)
# alert when >= 5 matching events in an aggregation
self.walkAggregations(threshold=5)
# Set alert properties
def onAggregation(self, aggreg):
# aggreg['count']: number of items in the aggregation, ex: number of failed login attempts
# aggreg['value']: value of the aggregation field, ex: toto@example.com
# aggreg['events']: list of events in the aggregation
category = 'bro'
tags = ['bro']
severity = 'NOTICE'
summary = ('{0} {1}: {2}'.format(aggreg['count'], 'bro intel match', aggreg['value']))
# append first 3 source IPs
summary += ' sample sourceips: '
for e in aggreg['events'][:3]:
if 'sourceipaddress' in e['_source']['details'].keys():
summary += '{0} '.format(e['_source']['details']['sourceipaddress'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

View file

@ -1,303 +0,0 @@
{
"title": "",
"services": {
"query": {
"idQueue": [
1
],
"list": {
"0": {
"query": "*",
"alias": "",
"color": "#7EB26D",
"id": 0,
"pin": false,
"type": "lucene",
"enable": true
}
},
"ids": [
0
]
},
"filter": {
"idQueue": [
1
],
"list": {
"0": {
"type": "time",
"field": "utctimestamp",
"from": "now-6h",
"to": "now",
"mandate": "must",
"active": true,
"alias": "",
"id": 0
},
"1": {
"type": "field",
"field": "category",
"query": "\"bronotice\"",
"mandate": "must",
"active": true,
"alias": "",
"id": 1
},
"2": {
"type": "field",
"field": "details.note",
"query": "\"PacketFilter::Dropped_Packets\"",
"mandate": "mustNot",
"active": true,
"alias": "",
"id": 2
},
"3": {
"type": "field",
"field": "details.note",
"query": "\"CaptureLoss::Too_Much_Loss\"",
"mandate": "mustNot",
"active": true,
"alias": "",
"id": 3
},
"4": {
"type": "field",
"field": "details.note",
"query": "\"Weird::Activity\"",
"mandate": "mustNot",
"active": true,
"alias": "",
"id": 4
}
},
"ids": [
0,
1,
2,
3,
4
]
}
},
"rows": [
{
"title": "Graph",
"height": "350px",
"editable": true,
"collapse": false,
"collapsable": true,
"panels": [
{
"span": 12,
"editable": true,
"group": [
"default"
],
"type": "histogram",
"mode": "count",
"time_field": "utctimestamp",
"value_field": null,
"auto_int": true,
"resolution": 100,
"interval": "5m",
"fill": 3,
"linewidth": 3,
"timezone": "browser",
"spyable": true,
"zoomlinks": true,
"bars": false,
"stack": true,
"points": false,
"lines": true,
"legend": true,
"x-axis": true,
"y-axis": true,
"percentage": false,
"interactive": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"title": "Events over time",
"intervals": [
"auto",
"1s",
"1m",
"5m",
"10m",
"30m",
"1h",
"3h",
"12h",
"1d",
"1w",
"1M",
"1y"
],
"options": true,
"tooltip": {
"value_type": "cumulative",
"query_as_alias": true
},
"annotate": {
"enable": false,
"query": "*",
"size": 20,
"field": "_type",
"sort": [
"_score",
"desc"
]
},
"pointradius": 5,
"show_query": true,
"legend_counts": true,
"zerofill": true,
"derivative": false,
"scale": 1,
"y_as_bytes": false,
"grid": {
"max": null,
"min": 0
},
"y_format": "none"
}
],
"notice": false
},
{
"title": "Events",
"height": "350px",
"editable": true,
"collapse": false,
"collapsable": true,
"panels": [
{
"title": "All events",
"error": false,
"span": 12,
"editable": true,
"group": [
"default"
],
"type": "table",
"size": 100,
"pages": 5,
"offset": 100,
"sort": [
"utctimestamp",
"desc"
],
"style": {
"font-size": "9pt"
},
"overflow": "min-height",
"fields": [
"utctimestamp",
"summary",
"details.sourceipaddress",
"details.destinationipaddress"
],
"highlight": [],
"sortable": true,
"header": true,
"paging": true,
"spyable": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"field_list": true,
"status": "Stable",
"trimFactor": 300,
"normTimes": true,
"all_fields": false,
"localTime": false,
"timeField": "@timestamp"
}
],
"notice": false
}
],
"editable": true,
"failover": false,
"index": {
"interval": "day",
"pattern": "[events-]YYYYMMDD",
"default": "NO_TIME_FILTER_OR_INDEX_PATTERN_NOT_MATCHED",
"warm_fields": true
},
"style": "dark",
"panel_hints": true,
"pulldowns": [
{
"type": "query",
"collapse": false,
"notice": false,
"query": "*",
"pinned": true,
"history": [
"*",
"*brointel*"
],
"remember": 10,
"enable": true
},
{
"type": "filtering",
"collapse": false,
"notice": true,
"enable": true
}
],
"nav": [
{
"type": "timepicker",
"collapse": false,
"notice": false,
"status": "Stable",
"time_options": [
"5m",
"15m",
"1h",
"2h",
"6h",
"12h",
"24h"
],
"refresh_intervals": [
"10s",
"30s",
"1m",
"5m",
"15m",
"30m"
],
"timefield": "utctimestamp",
"now": true,
"filter_id": 0,
"enable": true
}
],
"loader": {
"save_gist": false,
"save_elasticsearch": true,
"save_local": true,
"save_default": true,
"save_temp": true,
"save_temp_ttl_enable": true,
"save_temp_ttl": "30d",
"load_gist": true,
"load_elasticsearch": true,
"load_elasticsearch_size": 20,
"load_local": true,
"hide": false
},
"refresh": false
}

View file

@ -0,0 +1,2 @@
[options]
skiphosts = 11.22.33.44 55.66.77.88

View file

@ -3,24 +3,40 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, PhraseMatch, TermsMatch
class AlertBruteforceSsh(AlertTask):
def main(self):
# look for events in last 15 mins
date_timedelta = dict(minutes=15)
# Configure filters by importing a kibana dashboard
self.filtersFromKibanaDash('bruteforce_ssh_dashboard.json', date_timedelta)
self.parse_config('bruteforce_ssh.conf', ['skiphosts'])
# Search aggregations on field 'sourceipaddress', keep 50 samples of events at most
self.searchEventsAggregated('sourceipaddress', samplesLimit=50)
# alert when >= 5 matching events in an aggregation
self.walkAggregations(threshold=2)
search_query = SearchQuery(minutes=2)
search_query.add_must([
TermMatch('_type', 'event'),
PhraseMatch('summary', 'failed'),
TermMatch('details.program', 'sshd'),
TermsMatch('summary', ['login', 'invalid', 'ldap_count_entries'])
])
for ip_address in self.config.skiphosts.split():
search_query.add_must_not(PhraseMatch('summary', ip_address))
self.filtersManual(search_query)
# Search aggregations on field 'sourceipaddress', keep X samples of
# events at most
self.searchEventsAggregated('details.sourceipaddress', samplesLimit=10)
# alert when >= X matching events in an aggregation
self.walkAggregations(threshold=10)
# Set alert properties
def onAggregation(self, aggreg):
@ -32,10 +48,9 @@ class AlertBruteforceSsh(AlertTask):
severity = 'NOTICE'
summary = ('{0} ssh bruteforce attempts by {1}'.format(aggreg['count'], aggreg['value']))
# append first 3 hostnames
for e in aggreg['events'][:3]:
if 'details' in e.keys() and 'hostname' in e['details']:
summary += ' on {0}'.format(e['_source']['details']['hostname'])
hosts = self.mostCommon(aggreg['allevents'], '_source.details.hostname')
for i in hosts[:5]:
summary += ' {0} ({1} hits)'.format(i[0], i[1])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

View file

@ -1,344 +0,0 @@
{
"title": "",
"services": {
"query": {
"idQueue": [
1
],
"list": {
"0": {
"query": "*",
"alias": "",
"color": "#7EB26D",
"id": 0,
"pin": false,
"type": "lucene",
"enable": true
}
},
"ids": [
0
]
},
"filter": {
"idQueue": [
1
],
"list": {
"0": {
"type": "time",
"field": "utctimestamp",
"from": "now-12h",
"to": "now",
"mandate": "must",
"active": true,
"alias": "",
"id": 0
},
"1": {
"type": "querystring",
"query": "_type:event",
"mandate": "must",
"active": true,
"alias": "",
"id": 1
},
"2": {
"type": "querystring",
"query": "_exists_:details.sourceipaddress",
"mandate": "must",
"active": true,
"alias": "",
"id": 2
},
"3": {
"type": "querystring",
"query": "eventsource:systemslogs",
"mandate": "must",
"active": true,
"alias": "",
"id": 3
},
"4": {
"type": "querystring",
"query": "summary:\"failed\"",
"mandate": "must",
"active": true,
"alias": "",
"id": 4
},
"5": {
"type": "querystring",
"query": "details.program:sshd",
"mandate": "must",
"active": true,
"alias": "",
"id": 5
},
"6": {
"type": "querystring",
"query": "summary:(login ldap_count_entries)",
"mandate": "must",
"active": true,
"alias": "",
"id": 6
},
"7": {
"type": "querystring",
"query": "summary:\"10.22.8.128\"",
"mandate": "mustNot",
"active": true,
"alias": "",
"id": 7
},
"8": {
"type": "querystring",
"query": "summary:\"10.8.75.35\"",
"mandate": "mustNot",
"active": true,
"alias": "",
"id": 8
},
"9": {
"type": "querystring",
"query": "summary:\"208.118.237.\"",
"mandate": "mustNot",
"active": true,
"alias": "",
"id": 9
}
},
"ids": [
0,
1,
2,
3,
4,
5,
6,
7,
8,
9
]
}
},
"rows": [
{
"title": "Graph",
"height": "350px",
"editable": true,
"collapse": false,
"collapsable": true,
"panels": [
{
"span": 12,
"editable": true,
"group": [
"default"
],
"type": "histogram",
"mode": "count",
"time_field": "utctimestamp",
"value_field": null,
"auto_int": true,
"resolution": 100,
"interval": "5m",
"fill": 3,
"linewidth": 3,
"timezone": "browser",
"spyable": true,
"zoomlinks": true,
"bars": false,
"stack": true,
"points": false,
"lines": true,
"legend": true,
"x-axis": true,
"y-axis": true,
"percentage": false,
"interactive": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"title": "Events over time",
"intervals": [
"auto",
"1s",
"1m",
"5m",
"10m",
"30m",
"1h",
"3h",
"12h",
"1d",
"1w",
"1M",
"1y"
],
"options": true,
"tooltip": {
"value_type": "cumulative",
"query_as_alias": true
},
"annotate": {
"enable": false,
"query": "*",
"size": 20,
"field": "_type",
"sort": [
"_score",
"desc"
]
},
"pointradius": 5,
"show_query": true,
"legend_counts": true,
"zerofill": true,
"derivative": false,
"scale": 1,
"y_as_bytes": false,
"grid": {
"max": null,
"min": 0
},
"y_format": "none"
}
],
"notice": false
},
{
"title": "Events",
"height": "350px",
"editable": true,
"collapse": false,
"collapsable": true,
"panels": [
{
"title": "All events",
"error": false,
"span": 12,
"editable": true,
"group": [
"default"
],
"type": "table",
"size": 100,
"pages": 5,
"offset": 0,
"sort": [
"utctimestamp",
"desc"
],
"style": {
"font-size": "9pt"
},
"overflow": "min-height",
"fields": [
"utctimestamp",
"summary",
"details.cluster_client_ip",
"details.uri"
],
"highlight": [],
"sortable": true,
"header": true,
"paging": true,
"spyable": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"field_list": true,
"status": "Stable",
"trimFactor": 300,
"normTimes": true,
"all_fields": true,
"localTime": false,
"timeField": "@timestamp"
}
],
"notice": false
}
],
"editable": true,
"failover": false,
"index": {
"interval": "day",
"pattern": "[events-]YYYYMMDD",
"default": "NO_TIME_FILTER_OR_INDEX_PATTERN_NOT_MATCHED",
"warm_fields": true
},
"style": "dark",
"panel_hints": true,
"pulldowns": [
{
"type": "query",
"collapse": false,
"notice": false,
"query": "*",
"pinned": true,
"history": [
"*",
"*brointel*"
],
"remember": 10,
"enable": true
},
{
"type": "filtering",
"collapse": false,
"notice": true,
"enable": true
}
],
"nav": [
{
"type": "timepicker",
"collapse": false,
"notice": false,
"status": "Stable",
"time_options": [
"5m",
"15m",
"1h",
"2h",
"6h",
"12h",
"24h"
],
"refresh_intervals": [
"10s",
"30s",
"1m",
"5m",
"15m",
"30m"
],
"timefield": "utctimestamp",
"now": true,
"filter_id": 0,
"enable": true
}
],
"loader": {
"save_gist": false,
"save_elasticsearch": true,
"save_local": true,
"save_default": true,
"save_temp": true,
"save_temp_ttl_enable": true,
"save_temp_ttl": "30d",
"load_gist": true,
"load_elasticsearch": true,
"load_elasticsearch_size": 20,
"load_local": true,
"hide": false
},
"refresh": false
}

View file

@ -1,52 +0,0 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
from lib.alerttask import AlertTask
import pyes
class AlertBruteforceSsh(AlertTask):
def main(self):
# look for events in last X mins
date_timedelta = dict(minutes=2)
# Configure filters using pyes
must = [
pyes.TermFilter('_type', 'event'),
pyes.QueryFilter(pyes.MatchQuery('summary','failed','phrase')),
pyes.TermFilter('program','sshd'),
pyes.QueryFilter(pyes.MatchQuery('summary', 'login invalid ldap_count_entries', 'boolean'))
]
must_not = [
pyes.QueryFilter(pyes.MatchQuery('summary','10.22.75.203','phrase')),
pyes.QueryFilter(pyes.MatchQuery('summary','10.8.75.144','phrase'))
]
self.filtersManual(date_timedelta, must=must, must_not=must_not)
# Search aggregations on field 'sourceipaddress', keep X samples of events at most
self.searchEventsAggregated('details.sourceipaddress', samplesLimit=10)
# alert when >= X matching events in an aggregation
self.walkAggregations(threshold=10)
# Set alert properties
def onAggregation(self, aggreg):
# aggreg['count']: number of items in the aggregation, ex: number of failed login attempts
# aggreg['value']: value of the aggregation field, ex: toto@example.com
# aggreg['events']: list of events in the aggregation
category = 'bruteforce'
tags = ['ssh']
severity = 'NOTICE'
summary = ('{0} ssh bruteforce attempts by {1}'.format(aggreg['count'], aggreg['value']))
hosts = self.mostCommon(aggreg['allevents'],'_source.details.hostname')
for i in hosts[:5]:
summary += ' {0} ({1} hits)'.format(i[0], i[1])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

View file

@ -0,0 +1,2 @@
[options]
url = https://www.mozilla.org

View file

@ -0,0 +1,47 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Michal Purzynski michal@mozilla.com
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, ExistsMatch, PhraseMatch
class AlertBugzillaPBruteforce(AlertTask):
def main(self):
self.parse_config('bugzillaauthbruteforce.conf', ['url'])
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('_type', 'bro'),
TermMatch('eventsource', 'nsm'),
TermMatch('category', 'bronotice'),
ExistsMatch('details.sourceipaddress'),
PhraseMatch('details.note', 'BugzBruteforcing::HTTP_BugzBruteforcing_Attacker'),
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'httperrors'
tags = ['http']
severity = 'NOTICE'
hostname = event['_source']['hostname']
url = self.config.url
# the summary of the alert is the same as the event
summary = '{0} {1}'.format(hostname, event['_source']['summary'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity=severity, url=url)

View file

@ -2,17 +2,13 @@ from celery import Celery
from lib.config import ALERTS, LOGGING, RABBITMQ
from logging.config import dictConfig
print ALERTS
# Alert files to include
alerts_include = []
for alert in ALERTS.keys():
alerts_include.append('.'.join((alert).split('.')[:-1]))
alerts_include = list(set(alerts_include))
print alerts_include
BROKER_URL = 'amqp://{0}:{1}@{2}:{3}//'.format(
BROKER_URL = 'amqp://{0}:{1}@{2}:{3}//'.format(
RABBITMQ['mquser'],
RABBITMQ['mqpassword'],
RABBITMQ['mqserver'],

View file

@ -1,44 +0,0 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
from lib.alerttask import AlertTask
class AlertCloudtrail(AlertTask):
def main(self):
# look for events in last 160 hours
date_timedelta = dict(hours=160)
# Configure filters by importing a kibana dashboard
self.filtersFromKibanaDash('cloudtrail_dashboard.json', date_timedelta)
# Search events
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'AWSCloudtrail'
tags = ['cloudtrail','aws']
severity = 'INFO'
summary = ('{0} called {1} from {2}'.format(event['_source']['userIdentity']['userName'], event['_source']['eventName'], event['_source']['sourceIPAddress']))
if event['_source']['eventName'] == 'RunInstances':
for i in event['_source']['responseElements']['instancesSet']['items']:
if 'privateDnsName' in i.keys():
summary += (' running {0} '.format(i['privateDnsName']))
elif 'instanceId' in i.keys():
summary += (' running {0} '.format(i['instanceId']))
else:
summary += (' running {0} '.format(flattenDict(i)))
if event['_source']['eventName'] == 'StartInstances':
for i in event['_source']['requestParameters']['instancesSet']['items']:
summary += (' starting {0} '.format(i['instanceId']))
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity)

View file

@ -1,283 +0,0 @@
{
"title": "",
"services": {
"query": {
"idQueue": [
1
],
"list": {
"0": {
"query": "",
"alias": "",
"color": "#7EB26D",
"id": 0,
"pin": false,
"type": "lucene",
"enable": true
}
},
"ids": [
0
]
},
"filter": {
"idQueue": [
1
],
"list": {
"0": {
"from": "2014-07-09T17:32:02.046Z",
"to": "now",
"type": "time",
"field": "utctimestamp",
"mandate": "must",
"active": true,
"alias": "",
"id": 0
},
"1": {
"type": "querystring",
"query": "_type:cloudtrail",
"mandate": "must",
"active": true,
"alias": "",
"id": 1
},
"2": {
"type": "querystring",
"query": "eventName:(runinstances, stopinstances, startinstances)",
"mandate": "must",
"active": true,
"alias": "",
"id": 2
}
},
"ids": [
0,
1,
2
]
}
},
"rows": [
{
"title": "Graph",
"height": "350px",
"editable": true,
"collapse": false,
"collapsable": true,
"panels": [
{
"span": 12,
"editable": true,
"group": [
"default"
],
"type": "histogram",
"mode": "count",
"time_field": "utctimestamp",
"value_field": null,
"auto_int": true,
"resolution": 100,
"interval": "1h",
"fill": 3,
"linewidth": 3,
"timezone": "browser",
"spyable": true,
"zoomlinks": true,
"bars": false,
"stack": true,
"points": false,
"lines": true,
"legend": true,
"x-axis": true,
"y-axis": true,
"percentage": false,
"interactive": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"title": "Events over time",
"intervals": [
"auto",
"1s",
"1m",
"5m",
"10m",
"30m",
"1h",
"3h",
"12h",
"1d",
"1w",
"1M",
"1y"
],
"options": true,
"tooltip": {
"value_type": "cumulative",
"query_as_alias": true
},
"annotate": {
"enable": false,
"query": "*",
"size": 20,
"field": "_type",
"sort": [
"_score",
"desc"
]
},
"pointradius": 5,
"show_query": true,
"legend_counts": true,
"zerofill": true,
"derivative": false,
"scale": 1,
"y_as_bytes": false,
"grid": {
"max": null,
"min": 0
},
"y_format": "none"
}
],
"notice": false
},
{
"title": "Events",
"height": "350px",
"editable": true,
"collapse": false,
"collapsable": true,
"panels": [
{
"title": "All events",
"error": false,
"span": 12,
"editable": true,
"group": [
"default"
],
"type": "table",
"size": 100,
"pages": 5,
"offset": 0,
"sort": [
"utctimestamp",
"desc"
],
"style": {
"font-size": "9pt"
},
"overflow": "min-height",
"fields": [
"utctimestamp",
"summary",
"details.cluster_client_ip",
"details.uri"
],
"highlight": [],
"sortable": true,
"header": true,
"paging": true,
"spyable": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"field_list": true,
"status": "Stable",
"trimFactor": 300,
"normTimes": true,
"all_fields": true,
"localTime": false,
"timeField": "@timestamp"
}
],
"notice": false
}
],
"editable": true,
"failover": false,
"index": {
"interval": "day",
"pattern": "[events-]YYYYMMDD",
"default": "NO_TIME_FILTER_OR_INDEX_PATTERN_NOT_MATCHED",
"warm_fields": true
},
"style": "dark",
"panel_hints": true,
"pulldowns": [
{
"type": "query",
"collapse": false,
"notice": false,
"query": "*",
"pinned": true,
"history": [
"",
"_exists_:alerttimestamp",
"*",
"*brointel*"
],
"remember": 10,
"enable": true
},
{
"type": "filtering",
"collapse": false,
"notice": true,
"enable": true
}
],
"nav": [
{
"type": "timepicker",
"collapse": false,
"notice": false,
"status": "Stable",
"time_options": [
"5m",
"15m",
"1h",
"2h",
"6h",
"12h",
"24h"
],
"refresh_intervals": [
"10s",
"30s",
"1m",
"5m",
"15m",
"30m"
],
"timefield": "utctimestamp",
"now": true,
"filter_id": 0,
"enable": true
}
],
"loader": {
"save_gist": false,
"save_elasticsearch": true,
"save_local": true,
"save_default": true,
"save_temp": true,
"save_temp_ttl_enable": true,
"save_temp_ttl": "30d",
"load_gist": true,
"load_elasticsearch": true,
"load_elasticsearch_size": 20,
"load_local": true,
"hide": false
},
"refresh": false
}

View file

@ -0,0 +1,38 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch
class AlertCloudtrailDeadman(AlertTask):
def main(self):
search_query = SearchQuery(hours=1)
search_query.add_must([
TermMatch('_type', 'cloudtrail'),
])
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
# if no events found
def onNoEvent(self):
category = 'deadman'
tags = ['cloudtrail', 'aws']
severity = 'ERROR'
summary = 'No cloudtrail events found the last hour'
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [], severity=severity)

View file

@ -0,0 +1,37 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch
class AlertCloudtrailLoggingDisabled(AlertTask):
def main(self):
search_query = SearchQuery(minutes=30)
search_query.add_must([
TermMatch('_type', 'cloudtrail'),
TermMatch('eventName', 'StopLogging'),
])
search_query.add_must_not(TermMatch('errorCode', 'AccessDenied'))
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
def onEvent(self, event):
category = 'AWSCloudtrail'
tags = ['cloudtrail', 'aws', 'cloudtrailpagerduty']
severity = 'CRITICAL'
summary = 'Cloudtrail Logging Disabled: ' + event['_source']['requestParameters']['name']
return self.createAlertDict(summary, category, tags, [event], severity)

View file

@ -1,49 +0,0 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
from lib.alerttask import AlertTask
import pyes
class AlertCloudtrail(AlertTask):
def main(self):
# look for events in last x hours
date_timedelta = dict(hours=1)
# Configure filters using pyes
must = [
pyes.TermFilter('_type', 'cloudtrail'),
pyes.TermsFilter('eventName',['runinstances','stopinstances','startinstances'])
]
self.filtersManual(date_timedelta, must=must)
# Search events
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'AWSCloudtrail'
tags = ['cloudtrail','aws']
severity = 'INFO'
summary = ('{0} called {1} from {2}'.format(event['_source']['userIdentity']['userName'], event['_source']['eventName'], event['_source']['sourceIPAddress']))
if event['_source']['eventName'] == 'RunInstances':
for i in event['_source']['responseElements']['instancesSet']['items']:
if 'privateDnsName' in i.keys():
summary += (' running {0} '.format(i['privateDnsName']))
elif 'instanceId' in i.keys():
summary += (' running {0} '.format(i['instanceId']))
else:
summary += (' running {0} '.format(flattenDict(i)))
if event['_source']['eventName'] == 'StartInstances':
for i in event['_source']['requestParameters']['instancesSet']['items']:
summary += (' starting {0} '.format(i['instanceId']))
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity)

View file

@ -0,0 +1,49 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Jonathan Claudius jclaudius@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, QueryStringMatch
class AlertConfluenceShellUsage(AlertTask):
def main(self):
# look for events in last X mins
search_query = SearchQuery(minutes=5)
search_query.add_must([
TermMatch('_type', 'auditd'),
TermMatch('details.user', 'confluence'),
QueryStringMatch('hostname: /.*(mana|confluence).*/'),
])
search_query.add_must_not(TermMatch('details.originaluser', 'root'))
self.filtersManual(search_query)
# Search aggregations on field 'sourceipaddress', keep X samples of
# events at most
self.searchEventsAggregated('hostname', samplesLimit=10)
# alert when >= X matching events in an aggregation
# in this case, always
self.walkAggregations(threshold=1)
# Set alert properties
def onAggregation(self, aggreg):
# aggreg['count']: number of items in the aggregation, ex: number of failed login attempts
# aggreg['value']: value of the aggregation field, ex: toto@example.com
# aggreg['events']: list of events in the aggregation
category = 'intrusion'
tags = ['confluence', 'mana']
severity = 'CRITICAL'
summary = 'Confluence user is running shell commands on {0}'.format(aggreg['value'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

View file

@ -0,0 +1,2 @@
[options]
url = https://www.mozilla.org

View file

@ -3,27 +3,30 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Michal Purzynski michal@mozilla.com
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
import pyes
from query_models import SearchQuery, TermMatch, ExistsMatch, PhraseMatch
class AlertCorrelatedIntelNotice(AlertTask):
def main(self):
self.parse_config('correlated_alerts.conf', ['url'])
# look for events in last 15 mins
date_timedelta = dict(minutes=15)
# Configure filters using pyes
must = [
pyes.TermFilter('_type', 'bro'),
pyes.TermFilter('eventsource', 'nsm'),
pyes.TermFilter('category', 'bronotice'),
pyes.ExistsFilter('details.sourceipaddress'),
pyes.QueryFilter(pyes.MatchQuery('details.note','CrowdStrike::Correlated_Alerts','phrase'))
]
self.filtersManual(date_timedelta, must=must)
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('_type', 'bro'),
TermMatch('eventsource', 'nsm'),
TermMatch('category', 'bronotice'),
ExistsMatch('details.sourceipaddress'),
PhraseMatch('details.note', 'CrowdStrike::Correlated_Alerts')
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()
@ -35,11 +38,10 @@ class AlertCorrelatedIntelNotice(AlertTask):
tags = ['nsm,bro,correlated']
severity = 'NOTICE'
hostname = event['_source']['hostname']
url = "https://mana.mozilla.org/wiki/display/SECURITY/NSM+IR+procedures"
url = self.config.url
# the summary of the alert is the same as the event
summary = '{0} {1}'.format(hostname, event['_source']['summary'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity=severity, url=url)

2
alerts/deadman.conf Normal file
View file

@ -0,0 +1,2 @@
[options]
url = https://www.mozilla.org

53
alerts/deadman.py Normal file
View file

@ -0,0 +1,53 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Jeff Bryner jbryner@mozilla.com
# Brandon Myers bmyers@mozilla.com
#
# a collection of alerts looking for the lack of events
# to alert on a dead input source.
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, PhraseMatch
class broNSM(AlertTask):
def main(self, *args, **kwargs):
self.parse_config('deadman.conf', ['url'])
# call with hostlist=['host1','host2','host3']
# to search for missing events
if kwargs and 'hostlist' in kwargs.keys():
for host in kwargs['hostlist']:
self.log.debug('checking deadman for host: {0}'.format(host))
search_query = SearchQuery(minutes=20)
search_query.add_must([
PhraseMatch("details.note", "MozillaAlive::Bro_Is_Watching_You"),
PhraseMatch("details.peer_descr", host),
TermMatch('category', 'bronotice'),
TermMatch('_type', 'bro')
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()
self.walkEvents(hostname=host)
# Set alert properties
# if no events found
def onNoEvent(self, hostname):
category = 'deadman'
tags = ['bro']
severity = 'ERROR'
summary = ('No {0} bro healthcheck events found the past 20 minutes'.format(hostname))
url = self.config.url
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [], severity=severity, url=url)

2
alerts/duo_authfail.conf Normal file
View file

@ -0,0 +1,2 @@
[options]
url = https://www.mozilla.org

54
alerts/duo_authfail.py Normal file
View file

@ -0,0 +1,54 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
# Aaron Meihm ameihm@mozilla.com
# Michal Purzynski <mpurzynski@mozilla.com>
# Alicia Smith <asmith@mozilla.com>
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, ExistsMatch, PhraseMatch
class AlertDuoAuthFail(AlertTask):
def main(self):
self.parse_config('duo_authfail.conf', ['url'])
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('_type', 'event'),
TermMatch('category', 'event'),
ExistsMatch('details.sourceipaddress'),
ExistsMatch('details.username'),
PhraseMatch('details.result', 'FRAUD'),
])
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'duosecurity'
tags = ['duosecurity', 'duosecuritypagerduty']
severity = 'WARNING'
url = self.config.url
sourceipaddress = 'unknown'
user = 'unknown'
x = event['_source']
if 'details' in x:
if 'sourceipaddress' in x['details']:
sourceipaddress = x['details']['sourceipaddress']
if 'username' in x['details']:
user = x['details']['username']
summary = 'Duo Authentication Failure: user {1} rejected and marked a Duo Authentication attempt from {0} as fraud'.format(sourceipaddress, user)
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity, url)

View file

@ -13,19 +13,19 @@
# this case a VPN certificate)
from lib.alerttask import AlertTask
import pyes
from query_models import SearchQuery, PhraseMatch
class AlertDuoFailOpen(AlertTask):
def main(self):
# look for events in last 15 mins
date_timedelta = dict(minutes=15)
# Configure filters using pyes
must = [
pyes.QueryFilter(pyes.MatchQuery('summary','Failsafe Duo login','phrase'))
]
self.filtersManual(date_timedelta, must=must)
search_query = SearchQuery(minutes=15)
# Search aggregations on field 'sourceipaddress', keep X samples of events at most
search_query.add_must(PhraseMatch('summary', 'Failsafe Duo login'))
self.filtersManual(search_query)
# Search aggregations on field 'sourceipaddress', keep X samples of
# events at most
self.searchEventsAggregated('details.hostname', samplesLimit=10)
# alert when >= X matching events in an aggregation
# in this case, always
@ -44,4 +44,3 @@ class AlertDuoFailOpen(AlertTask):
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

View file

@ -1,292 +0,0 @@
{
"title": "",
"services": {
"query": {
"idQueue": [
1
],
"list": {
"0": {
"query": "",
"alias": "",
"color": "#7EB26D",
"id": 0,
"pin": false,
"type": "lucene",
"enable": true
}
},
"ids": [
0
]
},
"filter": {
"idQueue": [
1
],
"list": {
"0": {
"type": "time",
"field": "utctimestamp",
"from": "now-6h",
"to": "now",
"mandate": "must",
"active": true,
"alias": "",
"id": 0
},
"1": {
"type": "querystring",
"query": "_type:event",
"mandate": "must",
"active": true,
"alias": "",
"id": 1
},
"2": {
"type": "querystring",
"query": "program:fail2ban",
"mandate": "must",
"active": true,
"alias": "",
"id": 2
},
"3": {
"type": "querystring",
"query": "summary:\"has been banned\"",
"mandate": "must",
"active": true,
"alias": "",
"id": 3
}
},
"ids": [
0,
1,
2,
3
]
}
},
"rows": [
{
"title": "Graph",
"height": "350px",
"editable": true,
"collapse": false,
"collapsable": true,
"panels": [
{
"span": 12,
"editable": true,
"group": [
"default"
],
"type": "histogram",
"mode": "count",
"time_field": "utctimestamp",
"value_field": null,
"auto_int": true,
"resolution": 100,
"interval": "5m",
"fill": 3,
"linewidth": 3,
"timezone": "browser",
"spyable": true,
"zoomlinks": true,
"bars": false,
"stack": true,
"points": false,
"lines": true,
"legend": true,
"x-axis": true,
"y-axis": true,
"percentage": false,
"interactive": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"title": "Events over time",
"intervals": [
"auto",
"1s",
"1m",
"5m",
"10m",
"30m",
"1h",
"3h",
"12h",
"1d",
"1w",
"1M",
"1y"
],
"options": true,
"tooltip": {
"value_type": "cumulative",
"query_as_alias": true
},
"annotate": {
"enable": false,
"query": "*",
"size": 20,
"field": "_type",
"sort": [
"_score",
"desc"
]
},
"pointradius": 5,
"show_query": true,
"legend_counts": true,
"zerofill": true,
"derivative": false,
"scale": 1,
"y_as_bytes": false,
"grid": {
"max": null,
"min": 0
},
"y_format": "none"
}
],
"notice": false
},
{
"title": "Events",
"height": "350px",
"editable": true,
"collapse": false,
"collapsable": true,
"panels": [
{
"title": "All events",
"error": false,
"span": 12,
"editable": true,
"group": [
"default"
],
"type": "table",
"size": 100,
"pages": 5,
"offset": 0,
"sort": [
"utctimestamp",
"desc"
],
"style": {
"font-size": "9pt"
},
"overflow": "min-height",
"fields": [
"utctimestamp",
"summary",
"details.cluster_client_ip",
"details.uri"
],
"highlight": [],
"sortable": true,
"header": true,
"paging": true,
"spyable": true,
"queries": {
"mode": "all",
"ids": [
0
]
},
"field_list": true,
"status": "Stable",
"trimFactor": 300,
"normTimes": true,
"all_fields": true,
"localTime": false,
"timeField": "@timestamp"
}
],
"notice": false
}
],
"editable": true,
"failover": false,
"index": {
"interval": "day",
"pattern": "[events-]YYYYMMDD",
"default": "NO_TIME_FILTER_OR_INDEX_PATTERN_NOT_MATCHED",
"warm_fields": true
},
"style": "dark",
"panel_hints": true,
"pulldowns": [
{
"type": "query",
"collapse": false,
"notice": false,
"query": "*",
"pinned": true,
"history": [
"",
"_exists_:alerttimestamp",
"*",
"*brointel*"
],
"remember": 10,
"enable": true
},
{
"type": "filtering",
"collapse": false,
"notice": true,
"enable": true
}
],
"nav": [
{
"type": "timepicker",
"collapse": false,
"notice": false,
"status": "Stable",
"time_options": [
"5m",
"15m",
"1h",
"2h",
"6h",
"12h",
"24h"
],
"refresh_intervals": [
"10s",
"30s",
"1m",
"5m",
"15m",
"30m"
],
"timefield": "utctimestamp",
"now": true,
"filter_id": 0,
"enable": true
}
],
"loader": {
"save_gist": false,
"save_elasticsearch": true,
"save_local": true,
"save_default": true,
"save_temp": true,
"save_temp_ttl_enable": true,
"save_temp_ttl": "30d",
"load_gist": true,
"load_elasticsearch": true,
"load_elasticsearch_size": 20,
"load_local": true,
"hide": false
},
"refresh": false
}

View file

@ -1,39 +0,0 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
from lib.alerttask import AlertTask
import pyes
class AlertFail2ban(AlertTask):
def main(self):
# look for events in last 10 mins
date_timedelta = dict(minutes=10)
# Configure filters using pyes
must = [
pyes.TermFilter('_type', 'event'),
pyes.TermFilter('program', 'fail2ban'),
pyes.QueryFilter(pyes.MatchQuery("summary","banned for","phrase"))
]
self.filtersManual(date_timedelta, must=must)
# Search events
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'fail2ban'
tags = ['fail2ban']
severity = 'NOTICE'
summary='{0}: {1}'.format(event['_source']['details']['hostname'], event['_source']['summary'].strip())
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity)

56
alerts/fxaAlerts.py Normal file
View file

@ -0,0 +1,56 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Jeff Bryner jbryner@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, PhraseMatch, WildcardMatch
class AlertAccountCreations(AlertTask):
def main(self):
search_query = SearchQuery(minutes=10)
search_query.add_must([
TermMatch('_type', 'event'),
TermMatch('tags', 'firefoxaccounts'),
PhraseMatch('details.path', '/v1/account/create')
])
# ignore test accounts and attempts to create accounts that already exist.
search_query.add_must_not([
WildcardMatch('details.email', '*restmail.net'),
TermMatch('details.code', '429')
])
self.filtersManual(search_query)
# Search aggregations on field 'sourceipv4address', keep X samples of events at most
self.searchEventsAggregated('details.sourceipv4address', samplesLimit=10)
# alert when >= X matching events in an aggregation
self.walkAggregations(threshold=10)
# Set alert properties
def onAggregation(self, aggreg):
# aggreg['count']: number of items in the aggregation, ex: number of failed login attempts
# aggreg['value']: value of the aggregation field, ex: toto@example.com
# aggreg['events']: list of events in the aggregation
category = 'fxa'
tags = ['fxa']
severity = 'INFO'
summary = ('{0} fxa account creation attempts by {1}'.format(aggreg['count'], aggreg['value']))
emails = self.mostCommon(aggreg['allevents'],'_source.details.email')
# did they try to create more than one email account?
# or just retry an existing one
if len(emails) > 1:
for i in emails[:5]:
summary += ' {0} ({1} hits)'.format(i[0], i[1])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

View file

@ -0,0 +1,2 @@
[options]
alert_data_location = /opt/mozdef/envs/mozdef/alerts/generic_alerts

View file

@ -0,0 +1,162 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# kang@mozilla.com
# bmyers@mozilla.com
# TODO: Don't use query_models; nicer fixes for AlertTask
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, QueryStringMatch
import hjson
import logging
import sys
import traceback
import glob
import os
# Minimum data needed for an alert (this is an example alert json)
'''
{
# Lucene search string
'search_string': 'field1: matchingvalue and field2: matchingothervalue',
# ES Filters as such: [['field', 'value'], ['field', 'value']]
'filters': [],
# What to aggregate on if we get multiple matches?
'aggregation_key': 'summary',
# How long to search and aggregate for? The longer the slower.
# These defaults work well for alerts that basically don't *need*
# much aggregation
'threshold': {
'timerange_min': 5,
'count': 1
},
# This is the category that will show up in mozdef, and the severity
'alert_category': 'generic_alerts',
'alert_severity': 'INFO',
# This will show up as the alert text when it triggers
'summary': 'Example summary that shows up in the alert',
# This helps with sorting out alerts, so it's nice if you fill this in
'tags': ['generic'],
# This is the alert documentation
'url': 'https://mozilla.org'
}
'''
logger = logging.getLogger()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
class DotDict(dict):
'''dict.item notation for dict()'s'''
__getattr__ = dict.__getitem__
__setattr__ = dict.__setitem__
__delattr__ = dict.__delitem__
def __init__(self, dct):
for key, value in dct.items():
if hasattr(value, 'keys'):
value = DotDict(value)
self[key] = value
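# Illustrative example of the attribute access DotDict enables (values below
# are made up): nested dicts are wrapped as well, so a rule value can be read
# with dot notation, e.g.
#   cfg = DotDict({'threshold': {'count': 1}})
#   cfg.threshold.count  # -> 1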
class AlertGenericLoader(AlertTask):
required_fields = [
"search_string",
"filters",
"aggregation_key",
"time_window",
"num_samples",
"num_aggregations",
"alert_category",
"alert_tags",
"alert_severity",
"alert_summary",
"alert_url",
]
def validate_alert(self, alert):
for key in self.required_fields:
if key not in alert:
logger.error('Your alert does not have the required field {}'.format(key))
raise KeyError
def load_configs(self):
'''Load all configured rules'''
self.configs = []
rules_location = os.path.join(self.config.alert_data_location, "rules")
files = glob.glob(rules_location + "/*.json")
for f in files:
with open(f) as fd:
try:
cfg = DotDict(hjson.load(fd))
self.validate_alert(cfg)
self.configs.append(cfg)
except Exception:
logger.error("Loading rule file {} failed".format(f))
def process_alert(self, alert_config):
search_query = SearchQuery(minutes=int(alert_config.time_window))
terms = []
for i in alert_config.filters:
terms.append(TermMatch(i[0], i[1]))
terms.append(QueryStringMatch(str(alert_config.search_string)))
search_query.add_must(terms)
self.filtersManual(search_query)
self.searchEventsAggregated(alert_config.aggregation_key, samplesLimit=int(alert_config.num_samples))
self.walkAggregations(threshold=int(alert_config.num_aggregations), config=alert_config)
def main(self):
self.parse_config('generic_alert_loader.conf', ['alert_data_location'])
self.load_configs()
for cfg in self.configs:
try:
self.process_alert(cfg)
except Exception:
traceback.print_exc(file=sys.stdout)
logger.error("Processing rule file {} failed".format(cfg.__str__()))
def onAggregation(self, aggreg):
# aggreg['count']: number of items in the aggregation, ex: number of failed login attempts
# aggreg['value']: value of the aggregation field, ex: toto@example.com
# aggreg['events']: list of events in the aggregation
category = aggreg['config']['alert_category']
tags = aggreg['config']['alert_tags']
severity = aggreg['config']['alert_severity']
url = aggreg['config']['alert_url']
# Find all affected hosts
# Normally, the hostname data is in e.details.hostname so try that first,
# but fall back to e.hostname if it is missing, or nothing at all if there's no hostname! ;-)
hostnames = []
for e in aggreg['events']:
event_source = e['_source']
if 'details' in event_source and 'hostname' in event_source['details']:
hostnames.append(event_source['details']['hostname'])
elif 'hostname' in event_source:
hostnames.append(event_source['hostname'])
summary = '{} ({}): {}'.format(
aggreg['config']['alert_summary'],
aggreg['count'],
aggreg['value'],
)
if hostnames:
summary += ' [{}]'.format(', '.join(hostnames))
return self.createAlertDict(summary, category, tags, aggreg['events'], severity, url)
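# For reference, a minimal hypothetical rule file that load_configs() would
# pick up from <alert_data_location>/rules/*.json; every key listed in
# required_fields must be present, and all values below are illustrative only:
#
# {
#     "search_string": "hostname: /badhost.*/",
#     "filters": [["category", "syslog"]],
#     "aggregation_key": "details.sourceipaddress",
#     "time_window": 5,
#     "num_samples": 10,
#     "num_aggregations": 1,
#     "alert_category": "generic_alerts",
#     "alert_tags": ["generic"],
#     "alert_severity": "INFO",
#     "alert_summary": "Example generic alert",
#     "alert_url": "https://mozilla.org"
# }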

View file

@ -9,20 +9,22 @@
# Aaron Meihm <ameihm@mozilla.com>
from lib.alerttask import AlertTask
import pyes
from query_models import SearchQuery, TermMatch
class AlertGeomodel(AlertTask):
# The minimum event severity we will create an alert for
MINSEVERITY = 2
def main(self):
date_timedelta = dict(minutes=30)
search_query = SearchQuery(minutes=30)
must = [
pyes.TermFilter('_type', 'event'),
pyes.TermFilter('category', 'geomodelnotice'),
]
self.filtersManual(date_timedelta, must=must, must_not=[])
search_query.add_must([
TermMatch('_type', 'event'),
TermMatch('category', 'geomodelnotice'),
])
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
@ -30,7 +32,7 @@ class AlertGeomodel(AlertTask):
def onEvent(self, event):
category = 'geomodel'
tags = ['geomodel']
severity = 'WARNING'
severity = 'NOTICE'
ev = event['_source']
@ -41,5 +43,10 @@ class AlertGeomodel(AlertTask):
if ev['details']['severity'] < self.MINSEVERITY:
return None
# By default we assign a MozDef severity of NOTICE, but up this if the
# geomodel alert is sev 3
if ev['details']['severity'] == 3:
severity = 'WARNING'
summary = ev['summary']
return self.createAlertDict(summary, category, tags, [event], severity)

View file

@ -0,0 +1,47 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, ExistsMatch, PhraseMatch
class AlertHostScannerFinding(AlertTask):
def main(self):
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('_type', 'cef'),
ExistsMatch('details.dhost'),
PhraseMatch("details.signatureid", "sensitivefiles")
])
self.filtersManual(search_query)
# Search aggregations on field 'dhost', keep X samples of events at most
self.searchEventsAggregated('details.dhost', samplesLimit=30)
# alert when >= X matching events in an aggregation
self.walkAggregations(threshold=1)
# Set alert properties
def onAggregation(self, aggreg):
# aggreg['count']: number of items in the aggregation, ex: number of failed login attempts
# aggreg['value']: value of the aggregation field, ex: toto@example.com
# aggreg['events']: list of events in the aggregation
category = 'hostscanner'
tags = ['hostscanner']
severity = 'NOTICE'
summary = ('{0} host scanner findings on {1}'.format(aggreg['count'], aggreg['value']))
filenames = self.mostCommon(aggreg['allevents'],'_source.details.path')
for i in filenames[:5]:
summary += ' {0} ({1} hits)'.format(i[0], i[1])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

View file

@ -0,0 +1,2 @@
[options]
url = https://www.mozilla.org

View file

@ -0,0 +1,46 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Michal Purzynski michal@mozilla.com
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, ExistsMatch, PhraseMatch
class AlertHTTPBruteforce(AlertTask):
def main(self):
self.parse_config('httpauthbruteforce.conf', ['url'])
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('_type', 'bro'),
TermMatch('eventsource', 'nsm'),
TermMatch('category', 'bronotice'),
ExistsMatch('details.sourceipaddress'),
PhraseMatch('details.note', 'AuthBruteforcing::HTTP_AuthBruteforcing_Attacker'),
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'httperrors'
tags = ['http']
severity = 'WARNING'
hostname = event['_source']['hostname']
url = self.config.url
# the summary of the alert is the same as the event
summary = '{0} {1}'.format(hostname, event['_source']['summary'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity=severity, url=url)

2
alerts/httperrors.conf Normal file
View file

@ -0,0 +1,2 @@
[options]
url = https://www.mozilla.org

View file

@ -3,27 +3,31 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Michal Purzynski michal@mozilla.com
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
import pyes
from query_models import SearchQuery, TermMatch, ExistsMatch, PhraseMatch
class AlertHTTPErrors(AlertTask):
def main(self):
# look for events in last 15 mins
date_timedelta = dict(minutes=15)
# Configure filters using pyes
must = [
pyes.TermFilter('_type', 'bro'),
pyes.TermFilter('eventsource', 'nsm'),
pyes.TermFilter('category', 'bronotice'),
pyes.ExistsFilter('details.sourceipaddress'),
pyes.QueryFilter(pyes.MatchQuery('details.note','MozillaHTTPErrors::Excessive_HTTP_Errors_Attacker','phrase')),
]
self.filtersManual(date_timedelta, must=must)
self.parse_config('httperrors.conf', ['url'])
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('_type', 'bro'),
TermMatch('eventsource', 'nsm'),
TermMatch('category', 'bronotice'),
ExistsMatch('details.sourceipaddress'),
PhraseMatch('details.note', 'MozillaHTTPErrors::Excessive_HTTP_Errors_Attacker'),
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()
@ -33,13 +37,12 @@ class AlertHTTPErrors(AlertTask):
def onEvent(self, event):
category = 'httperrors'
tags = ['http']
severity = 'NOTICE'
severity = 'WARNING'
hostname = event['_source']['hostname']
url = "https://mana.mozilla.org/wiki/display/SECURITY/NSM+IR+procedures"
url = self.config.url
# the summary of the alert is the same as the event
summary = '{0} {1}'.format(hostname, event['_source']['summary'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity=severity, url=url)

View file

@ -6,16 +6,22 @@
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch
class AlertFail2ban(AlertTask):
class ldapAdd(AlertTask):
def main(self):
# look for events in last 10 mins
date_timedelta = dict(minutes=10)
# Configure filters by importing a kibana dashboard
self.filtersFromKibanaDash('fail2ban_dashboard.json', date_timedelta)
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('category', 'ldapChange'),
TermMatch('details.changetype', 'add')
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()
@ -23,12 +29,10 @@ class AlertFail2ban(AlertTask):
# Set alert properties
def onEvent(self, event):
category = 'fail2ban'
tags = ['fail2ban']
severity = 'NOTICE'
# the summary of the alert is the same as the event
summary = event['_source']['summary']
category = 'ldap'
tags = ['ldap']
severity = 'INFO'
summary='{0} added {1}'.format(event['_source']['details']['actor'], event['_source']['details']['dn'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity)
return self.createAlertDict(summary, category, tags, [event], severity)

View file

@ -6,16 +6,22 @@
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch
class AlertBroNotice(AlertTask):
class ldapDelete(AlertTask):
def main(self):
# look for events in last 30 mins
date_timedelta = dict(minutes=30)
# Configure filters by importing a kibana dashboard
self.filtersFromKibanaDash('bro_notice_dashboard.json', date_timedelta)
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('category', 'ldapChange'),
TermMatch('details.changetype', 'delete')
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()
@ -23,12 +29,10 @@ class AlertBroNotice(AlertTask):
# Set alert properties
def onEvent(self, event):
category = 'bro'
tags = ['bro']
severity = 'NOTICE'
# the summary of the alert is the one of the event
summary = event['_source']['summary']
category = 'ldap'
tags = ['ldap']
severity = 'INFO'
summary='{0} deleted {1}'.format(event['_source']['details']['actor'], event['_source']['details']['dn'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity)
return self.createAlertDict(summary, category, tags, [event], severity)

View file

@ -9,20 +9,20 @@
# Jeff Bryner jbryner@mozilla.com
from lib.alerttask import AlertTask
import pyes
from query_models import SearchQuery, TermMatch, PhraseMatch
class ldapGroupModify(AlertTask):
def main(self):
# look for events in last x
date_timedelta = dict(minutes=15)
# Configure filters using pyes
must = [
pyes.TermFilter('category', 'ldapChange'),
pyes.TermFilter('changetype', 'modify'),
pyes.QueryFilter(pyes.MatchQuery("summary","groups"))
]
self.filtersManual(date_timedelta, must=must)
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('category', 'ldapChange'),
TermMatch('details.changetype', 'modify'),
PhraseMatch("summary", "groups")
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()
self.walkEvents()
@ -32,7 +32,7 @@ class ldapGroupModify(AlertTask):
category = 'ldap'
tags = ['ldap']
severity = 'INFO'
summary='{0}'.format(event['_source']['summary'])
summary = '{0}'.format(event['_source']['summary'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity)

View file

@ -9,20 +9,19 @@
# Jeff Bryner jbryner@mozilla.com
from lib.alerttask import AlertTask
import pyes
from query_models import SearchQuery, TermMatch, PhraseMatch
class ldapLockout(AlertTask):
def main(self):
# look for events in last x
date_timedelta = dict(minutes=15)
# Configure filters
# looking for pwdAccountLockedTime setting by admin
must = [
pyes.TermFilter('category', 'ldapChange'),
pyes.TermFilter("actor", "cn=admin,dc=mozilla"),
pyes.QueryFilter(pyes.MatchQuery('changepairs', 'replace:pwdAccountLockedTime','phrase'))
]
self.filtersManual(date_timedelta, must=must)
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('category', 'ldapChange'),
TermMatch("details.actor", "cn=admin,dc=mozilla"),
PhraseMatch('details.changepairs', 'replace:pwdAccountLockedTime')
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()

View file

@ -3,46 +3,30 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
# Brandon Myers bmyers@mozilla.com
import collections
import json
import kombu
import pytz
import pyes
import os
import sys
from configlib import getConfig, OptionParser
from datetime import datetime
from datetime import timedelta
from dateutil.parser import parse
from collections import Counter
from celery import Task
from celery.utils.log import get_task_logger
from config import RABBITMQ, ES, OPTIONS
from config import RABBITMQ, ES
def toUTC(suspectedDate, localTimeZone=None):
'''make a UTC date out of almost anything'''
utc = pytz.UTC
objDate = None
if localTimeZone is None:
localTimeZone= OPTIONS['defaulttimezone']
if type(suspectedDate) in (str, unicode):
objDate = parse(suspectedDate, fuzzy=True)
elif type(suspectedDate) == datetime:
objDate = suspectedDate
if objDate.tzinfo is None:
objDate = pytz.timezone(localTimeZone).localize(objDate)
objDate = utc.normalize(objDate)
else:
objDate = utc.normalize(objDate)
if objDate is not None:
objDate = utc.normalize(objDate)
return objDate
sys.path.append(os.path.join(os.path.dirname(__file__), "../../lib"))
from utilities.toUTC import toUTC
from elasticsearch_client import ElasticsearchClient
from query_models import ExistsMatch
# utility functions used by AlertTask.mostCommon
@ -82,11 +66,15 @@ def getValueByPath(input_dict, path_string):
class AlertTask(Task):
abstract = True
def __init__(self):
self.alert_name = self.__class__.__name__
self.filter = None
self.begindateUTC = None
self.enddateUTC = None
self.main_query = None
# Used to store any alerts that were thrown
self.alert_ids = []
# List of events
self.events = None
# List of aggregations
@ -100,10 +88,20 @@ class AlertTask(Task):
self._configureKombu()
self._configureES()
self.event_indices = ['events', 'events-previous']
@property
def log(self):
return get_task_logger('%s.%s' % (__name__, self.alert_name))
def parse_config(self, config_filename, config_keys):
myparser = OptionParser()
self.config = None
(self.config, args) = myparser.parse_args([])
for config_key in config_keys:
temp_value = getConfig(config_key, '', config_filename)
setattr(self.config, config_key, temp_value)
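# Illustrative usage: an alert that calls
#   self.parse_config('httperrors.conf', ['url'])
# can then read the configured value as self.config.url.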
def _configureKombu(self):
"""
Configure kombu for rabbitmq
@ -121,27 +119,24 @@ class AlertTask(Task):
type='topic',
durable=True)
self.alertExchange(self.mqConn).declare()
alertQueue = kombu.Queue(RABBITMQ['alertqueue'],
exchange=self.alertExchange)
alertQueue = kombu.Queue(RABBITMQ['alertqueue'], exchange=self.alertExchange)
alertQueue(self.mqConn).declare()
self.mqproducer = self.mqConn.Producer(serializer='json')
self.log.debug('Kombu configured')
except Exception as e:
self.log.error('Exception while configuring kombu for alerts: {0}'.format(e))
def _configureES(self):
"""
Configure pyes for elasticsearch
Configure elasticsearch client
"""
try:
self.es = pyes.ES(ES['servers'])
self.es = ElasticsearchClient(ES['servers'])
self.log.debug('ES configured')
except Exception as e:
self.log.error('Exception while configuring ES for alerts: {0}'.format(e))
def mostCommon(self, listofdicts,dictkeypath):
def mostCommon(self, listofdicts, dictkeypath):
"""
Given a list containing dictionaries,
return the most common entries
@ -194,121 +189,54 @@ class AlertTask(Task):
Send alert to elasticsearch
"""
try:
res = self.es.index(index='alerts', doc_type='alert', doc=alertDict)
res = self.es.save_alert(body=alertDict)
self.log.debug('alert sent to ES')
self.log.debug(res)
return res
except Exception as e:
self.log.error('Exception while pushing alert to ES: {0}'.format(e))
def tagBotNotify(self, alert):
"""
Tag alert to be excluded based on severity
If 'ircchannel' is set in an alert, we automatically notify mozdefbot
"""
alert['notify_mozdefbot'] = True
if alert['severity'] == 'NOTICE' or alert['severity'] == 'INFO':
alert['notify_mozdefbot'] = False
def filtersManual(self, date_timedelta, must=[], should=[], must_not=[]):
# If an alert sets specific ircchannel, then we should probably always notify in mozdefbot
if 'ircchannel' in alert:
alert['notify_mozdefbot'] = True
return alert
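# For example: a 'NOTICE' or 'INFO' alert without an 'ircchannel' key ends up
# with notify_mozdefbot set to False, while any alert carrying 'ircchannel'
# is always flagged for the bot.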
def saveAlertID(self, saved_alert):
"""
Save alert to self so we can analyze it later
"""
self.alert_ids.append(saved_alert['_id'])
def filtersManual(self, query):
"""
Configure filters manually
date_timedelta is a dict in timedelta format
see https://docs.python.org/2/library/datetime.html#timedelta-objects
must, should and must_not are pyes filter objects lists
see http://pyes.readthedocs.org/en/latest/references/pyes.filters.html
query is a search query object with date_timedelta populated
"""
self.begindateUTC = toUTC(datetime.now() - timedelta(**date_timedelta))
self.enddateUTC = toUTC(datetime.now())
qDate = pyes.RangeQuery(qrange=pyes.ESRange('utctimestamp',
from_value=self.begindateUTC, to_value=self.enddateUTC))
q = pyes.ConstantScoreQuery(pyes.MatchAllQuery())
#Don't fire on already alerted events
if pyes.ExistsFilter('alerttimestamp') not in must_not:
must_not.append(pyes.ExistsFilter('alerttimestamp'))
must.append(qDate)
q.filters.append(pyes.BoolFilter(
must=must,
should=should,
must_not=must_not))
self.filter = q
def filtersFromKibanaDash(self, fp, date_timedelta):
"""
Import filters from a kibana dashboard
fp is the file path of the json file
date_timedelta is a dict in timedelta format
see https://docs.python.org/2/library/datetime.html#timedelta-objects
"""
f = open(fp)
data = json.load(f)
must = []
should = []
must_not = []
for filtid in data['services']['filter']['list'].keys():
filt = data['services']['filter']['list'][filtid]
if filt['active'] and 'query' in filt.keys():
value = filt['query'].split(':')[-1]
fieldname = filt['query'].split(':')[0].split('.')[-1]
# self.log.info(fieldname)
# self.log.info(value)
# field: fieldname
# query: value
if 'field' in filt.keys():
fieldname = filt['field']
value = filt['query']
if '\"' in value:
value = value.split('\"')[1]
pyesfilt = pyes.QueryFilter(pyes.MatchQuery(fieldname, value, 'phrase'))
else:
pyesfilt = pyes.TermFilter(fieldname, value)
else:
# _exists_:field
if filt['query'].startswith('_exists_:'):
pyesfilt = pyes.ExistsFilter(value.split('.')[-1])
# self.log.info('exists %s' % value.split('.')[-1])
# _missing_:field
elif filt['query'].startswith('_missing_:'):
pyesfilt = pyes.filters.MissingFilter(value.split('.')[-1])
# self.log.info('missing %s' % value.split('.')[-1])
# field:"value"
elif '\"' in value:
value = value.split('\"')[1]
pyesfilt = pyes.QueryFilter(pyes.MatchQuery(fieldname, value, 'phrase'))
# self.log.info("phrase %s %s" % (fieldname, value))
# field:(value1 value2 value3)
elif '(' in value and ')' in value:
value = value.split('(')[1]
value = value.split('(')[0]
pyesfilt = pyes.QueryFilter(pyes.MatchQuery(fieldname, value, "boolean"))
# field:value
else:
pyesfilt = pyes.TermFilter(fieldname, value)
# self.log.info("terms %s %s" % (fieldname, value))
if filt['mandate'] == 'must':
must.append(pyesfilt)
elif filt['mandate'] == 'either':
should.append(pyesfilt)
elif filt['mandate'] == 'mustNot':
must_not.append(pyesfilt)
# self.log.info(must)
f.close()
self.filtersManual(date_timedelta, must=must, should=should, must_not=must_not)
# Don't fire on already alerted events
if ExistsMatch('alerttimestamp') not in query.must_not:
query.add_must_not(ExistsMatch('alerttimestamp'))
self.main_query = query
def searchEventsSimple(self):
"""
Search events matching filters, store events in self.events
"""
try:
pyesresults = self.es.search(
self.filter,
size=1000,
indices='events,events-previous')
self.events = pyesresults._search_raw()['hits']['hits']
results = self.main_query.execute(self.es, indices=self.event_indices)
self.events = results['hits']
self.log.debug(self.events)
except Exception as e:
self.log.error('Error while searching events in ES: {0}'.format(e))
@ -323,18 +251,13 @@ class AlertTask(Task):
count: the hitcount of the text value
events: the sampled list of events that matched
allevents: the unsampled, total list of matching events
aggregationPath can be key.subkey.subkey to specify a path to a dictionary value
relative to the _source that's returned from elastic search.
ex: details.sourceipaddress
"""
try:
pyesresults = self.es.search(
self.filter,
size=1000,
indices='events,events-previous')
results = pyesresults._search_raw()['hits']['hits']
esresults = self.main_query.execute(self.es, indices=self.event_indices)
results = esresults['hits']
# List of aggregation values that can be counted/summarized by Counter
# Example: ['evil@evil.com','haxoor@noob.com', 'evil@evil.com'] for an email aggregField
@ -376,41 +299,48 @@ class AlertTask(Task):
for i in self.events:
alert = self.onEvent(i, **kwargs)
if alert:
alert = self.tagBotNotify(alert)
self.log.debug(alert)
alertResultES = self.alertToES(alert)
self.tagEventsAlert([i], alertResultES)
self.alertToMessageQueue(alert)
self.hookAfterInsertion(alert)
self.saveAlertID(alertResultES)
# did we not match anything?
# can also be used as an alert trigger
if len(self.events) == 0:
alert = self.onNoEvent(**kwargs)
if alert:
alert = self.tagBotNotify(alert)
self.log.debug(alert)
alertResultES = self.alertToES(alert)
self.alertToMessageQueue(alert)
self.hookAfterInsertion(alert)
self.saveAlertID(alertResultES)
def walkAggregations(self, threshold):
def walkAggregations(self, threshold, config=None):
"""
Walk through aggregations, provide some methods to hook in alerts
"""
if len(self.aggregations) > 0:
for aggregation in self.aggregations:
if aggregation['count'] >= threshold:
aggregation['config']=config
alert = self.onAggregation(aggregation)
self.log.debug(alert)
if alert:
alert = self.tagBotNotify(alert)
self.log.debug(alert)
alertResultES = self.alertToES(alert)
# even though we only sample events in the alert
# tag all events as alerted to avoid re-alerting
# on events we've already processed.
self.tagEventsAlert(aggregation['allevents'], alertResultES)
self.alertToMessageQueue(alert)
self.saveAlertID(alertResultES)
def createAlertDict(self, summary, category, tags, events, severity='NOTICE', url=None):
def createAlertDict(self, summary, category, tags, events, severity='NOTICE', url=None, ircchannel=None):
"""
Create an alert dict
"""
@ -420,7 +350,8 @@ class AlertTask(Task):
'summary': summary,
'category': category,
'tags': tags,
'events': []
'events': [],
'ircchannel': ircchannel,
}
if url:
alert['url'] = url
@ -488,9 +419,7 @@ class AlertTask(Task):
'id': alertResultES['_id']})
event['_source']['alerttimestamp'] = toUTC(datetime.now()).isoformat()
self.es.update(event['_index'], event['_type'],
event['_id'], document=event['_source'])
self.es.save_event(index=event['_index'], doc_type=event['_type'], body=event['_source'], doc_id=event['_id'])
except Exception as e:
self.log.error('Error while updating events in ES: {0}'.format(e))
@ -511,3 +440,18 @@ class AlertTask(Task):
self.log.debug('finished')
except Exception as e:
self.log.error('Exception in main() method: {0}'.format(e))
def parse_json_alert_config(self, config_file):
"""
Helper function to parse an alert config file
"""
alert_dir = os.path.join(os.path.dirname(__file__), '..')
config_file_path = os.path.join(alert_dir, config_file)
json_obj = {}
with open(config_file_path, "r") as fd:
try:
json_obj = json.load(fd)
except ValueError:
sys.stderr.write("FAILED to parse the configuration file\n")
return json_obj

View file

@ -7,35 +7,28 @@
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
from celery.schedules import crontab, timedelta
import time
import logging
ALERTS = {
'bro_intel.AlertBroIntel': {'schedule': crontab(minute='*/1')},
'bro_notice.AlertBroNotice': {'schedule': crontab(minute='*/1')},
'bruteforce_ssh.AlertBruteforceSsh': {'schedule': crontab(minute='*/1')},
'cloudtrail.AlertCloudtrail': {'schedule': crontab(minute='*/1')},
'fail2ban.AlertFail2ban': {'schedule': crontab(minute='*/1')},
ALERTS={
# 'pythonfile.pythonclass':{'schedule': crontab(minute='*/10')},
# 'pythonfile.pythonclass':{'schedule': timedelta(minutes=10),'kwargs':dict(hostlist=['nsm3', 'nsm5'])},
}
RABBITMQ = {
'mqserver': 'localhost',
'mquser': 'guest',
'mqpassword': 'guest',
'mqport': 5672,
'alertexchange': 'alerts',
'alertqueue': 'mozdef.alert'
'mqserver': 'localhost',
'mquser': 'mozdef',
'mqpassword': 'mozdef',
'mqport': 5672,
'alertexchange': 'alerts',
'alertqueue': 'mozdef.alert'
}
ES = {
'servers': ['http://localhost:9200']
}
OPTIONS = {
'defaulttimezone': 'UTC',
'servers': ['http://localhost:9200']
}
LOGGING = {
@ -67,7 +60,7 @@ LOGGING = {
'loggers': {
'celery': {
'handlers': ['celery', 'console'],
'level': 'DEBUG',
'level': 'INFO',
},
}
}

View file

@ -11,21 +11,22 @@
# Michal Purzynski <mpurzynski@mozilla.com>
from lib.alerttask import AlertTask
import pyes
from query_models import SearchQuery, TermMatch, ExistsMatch, TermsMatch
class AlertMultipleIntelHits(AlertTask):
def main(self):
# look for events in last X mins
date_timedelta = dict(minutes=2)
# Configure filters using pyes
must = [
pyes.TermFilter('_type', 'bro'),
pyes.TermFilter('eventsource', 'nsm'),
pyes.TermFilter('category', 'brointel'),
pyes.ExistsFilter('seenindicator'),
pyes.QueryFilter(pyes.MatchQuery('hostname', 'sensor1 sensor2 sensor3', 'boolean'))
]
self.filtersManual(date_timedelta, must=must)
search_query = SearchQuery(minutes=2)
search_query.add_must([
TermMatch('_type', 'bro'),
TermMatch('eventsource', 'nsm'),
TermMatch('category', 'brointel'),
ExistsMatch('details.seenindicator'),
TermsMatch('hostname', ['sensor1', 'sensor2', 'sensor3'])
])
self.filtersManual(search_query)
# Search aggregations on field 'seenindicator', keep X samples of events at most
self.searchEventsAggregated('details.seenindicator', samplesLimit=10)

View file

@ -0,0 +1,48 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Jonathan Claudius jclaudius@mozilla.com
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, PhraseMatch
class AlertOpenPortViolation(AlertTask):
def main(self):
search_query = SearchQuery(hours=4)
search_query.add_must([
TermMatch('_type', 'event'),
PhraseMatch('tags', 'open_port_policy_violation'),
])
self.filtersManual(search_query)
# Search aggregations on field 'destinationipaddress', keep X samples of
# events at most
self.searchEventsAggregated('details.destinationipaddress', samplesLimit=100)
# alert when >= X matching events in an aggregation
self.walkAggregations(threshold=1)
# Set alert properties
def onAggregation(self, aggreg):
# aggreg['count']: number of items in the aggregation, ex: number of failed login attempts
# aggreg['value']: value of the aggregation field, ex: toto@example.com
# aggreg['events']: list of events in the aggregation
category = 'open_port_policy_violation'
tags = ['open_port_policy_violation', 'openportpagerduty']
severity = 'CRITICAL'
summary = ('{0} unauthorized open port(s) on {1} ('.format(aggreg['count'], aggreg['value']))
for event in aggreg['events'][:5]:
summary += str(event['_source']['details']['destinationport']) + ' '
summary += ')'
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

View file

@ -1,29 +0,0 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Jeff Bryner jbryner@mozilla.com
class message(object):
def __init__(self):
'''
takes an incoming alert
and simply prints it
'''
self.registration = ['bro']
self.priority = 2
def onMessage(self, message):
# here is where you do something with the incoming alert message
if 'summary' in message.keys() :
print message['summary']
# you can modify the message if needed
# plugins registered with lower (>2) priority
# will receive the message and can also act on it
# but even if not modified, you must return it
return message

View file

@ -0,0 +1,5 @@
[options]
serviceKey = <yourapiservicekey>
docs = {"alertcategory1": "https://website.with.runbooks.com/alert-runbook"}
keywords = keyword1 keyword2
clienturl = https://a.link.to.your.siem.com/alert

View file

@ -20,9 +20,6 @@ class message(object):
and uses it to trigger an event using
the pager duty event api
'''
self.registration = ['bro']
self.priority = 2
# set my own conf file
# relative path to the rest index.py file
@ -31,43 +28,56 @@ class message(object):
if os.path.exists(self.configfile):
sys.stdout.write('found conf file {0}\n'.format(self.configfile))
self.initConfiguration()
self.registration = self.options.keywords.split(" ")
self.priority = 1
def initConfiguration(self):
myparser = OptionParser()
# setup self.options by sending empty list [] to parse_args
(self.options, args) = myparser.parse_args([])
# fill self.options with plugin-specific options
# change this to your default zone for when it's not specified
self.options.serviceKey = getConfig('serviceKey', 'APIKEYHERE', self.configfile)
self.options.keywords = getConfig('keywords', 'KEYWORDS', self.configfile)
self.options.clienturl = getConfig('clienturl', 'CLIENTURL', self.configfile)
try:
self.options.docs = json.loads(getConfig('docs', {}, self.configfile))
except:
self.options.docs = {}
def onMessage(self, message):
# here is where you do something with the incoming alert message
doclink = 'unknown'
if message['category'] in self.options.docs.keys():
doclink = self.options.docs[message['category']]
if 'summary' in message.keys() :
print message['summary']
headers = {
'Content-type': 'application/json',
}
payload = json.dumps({
"service_key": "{0}".format(self.options.serviceKey),
"incident_key": "bro",
"event_type": "trigger",
"description": "{0}".format(message['summary']),
"client": "mozdef",
"client_url": "http://mozdef.rocks",
"details": message['events']
"client": "MozDef",
"client_url": "https://" + self.options.clienturl + "/{0}".format(message['events'][0]['documentsource']['alerts'][0]['id']),
# "details": message['events'],
"contexts": [
{
"type": "link",
"href": "https://" + "{0}".format(doclink),
"text": "View runbook on mana"
}
]
})
r = requests.post(
'https://events.pagerduty.com/generic/2010-04-15/create_event.json',
headers=headers,
data=payload,
)
print r.status_code
print r.text
# you can modify the message if needed
# plugins registered with lower (>2) priority
# will receive the message and can also act on it
# but even if not modified, you must return it
return message
return message

View file

@ -0,0 +1,4 @@
[options]
smtpserver = <add_smtpserver>
sender = <add_sender_email>
recipients = <add_recipient_email>,<add_recipient2_email>

View file

@ -0,0 +1,73 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Alicia Smith <asmith@mozilla.com>
# Michal Purzynski <mpurzynski@mozilla.com>
# Brandon Myers <bmyers@mozilla.com>
import os
import sys
from datetime import datetime
from configlib import getConfig, OptionParser
import smtplib
from email.mime.text import MIMEText
from email.Utils import formatdate
from time import mktime
class message(object):
def __init__(self):
'''
takes an incoming alert
and uses it to trigger an email sent to
the releng signing server team
'''
self.registration = ['access']
self.priority = 2
# set my own conf file
# relative path to the alerts alertWorker.py file
self.configfile = './plugins/ssh_access_signreleng.conf'
self.options = None
if os.path.exists(self.configfile):
sys.stdout.write('found conf file {0}\n'.format(self.configfile))
self.initConfiguration()
def initConfiguration(self):
myparser = OptionParser()
# setup self.options by sending empty list [] to parse_args
(self.options, args) = myparser.parse_args([])
# email settings
self.options.smtpserver = getConfig('smtpserver', 'localhost', self.configfile)
self.options.sender = getConfig('sender', 'donotreply@localhost.com', self.configfile)
recipients_str = getConfig('recipients', 'noone@localhost.com', self.configfile)
self.options.recipients = recipients_str.split(',')
def onMessage(self, message):
# here is where you do something with the incoming alert message
emailMessage = MIMEText(message['summary'] + ' on ' + message['events'][0]['documentsource']['utctimestamp'])
emailMessage['Subject'] = 'MozDef Alert: Releng Restricted Servers Successful SSH Access'
emailMessage['From'] = self.options.sender
emailMessage['To'] = ','.join(self.options.recipients)
nowtuple = mktime(datetime.utcnow().timetuple())
# The Date field needs to be in a specific format, and we must
# define it or gmail struggles to parse it.
emailMessage['Date'] = formatdate(nowtuple)
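# formatdate() yields an RFC 2822 style date, e.g. "Tue, 27 Jun 2017 20:30:21 -0000"
# (example value only).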
smtpObj = smtplib.SMTP(self.options.smtpserver, 25)
try:
smtpObj.sendmail(self.options.sender, self.options.recipients, emailMessage.as_string())
smtpObj.quit()
except smtplib.SMTPException as e:
sys.stderr.write('Error: failed to send email {0}\n'.format(e))
# you can modify the message if needed
# plugins registered with lower (>2) priority
# will receive the message and can also act on it
# but even if not modified, you must return it
return message

43
alerts/promisc_audit.py Normal file
View file

@ -0,0 +1,43 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Michal Purzynski mpurzynski@mozilla.com
#
# This code alerts whenever a network interface enters promiscuous mode, based on auditd events (virtual veth devices are ignored)
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, QueryStringMatch, PhraseMatch
class PromiscAudit(AlertTask):
def main(self):
search_query = SearchQuery(minutes=2)
search_query.add_must([
TermMatch('_type', 'auditd'),
TermMatch('category', 'promiscuous'),
PhraseMatch('summary', 'promiscuous'),
PhraseMatch('summary', 'on'),
])
search_query.add_must_not([
QueryStringMatch('details.dev: veth*'),
])
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
def onEvent(self, event):
category = 'promisc'
severity = 'WARNING'
tags = ['promisc', 'audit']
summary = 'Promiscuous mode enabled on {0}'.format(event['_source']['hostname'])
return self.createAlertDict(summary, category, tags, [event], severity)

45
alerts/promisc_kernel.py Normal file
View file

@ -0,0 +1,45 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Michal Purzynski mpurzynski@mozilla.com
#
# This code alerts whenever a network interface enters promiscuous mode, based on kernel syslog messages (virtual veth devices are ignored)
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, QueryStringMatch, PhraseMatch
class PromiscKernel(AlertTask):
def main(self):
search_query = SearchQuery(minutes=2)
search_query.add_must([
TermMatch('_type', 'event'),
TermMatch('category', 'syslog'),
PhraseMatch('summary', 'promiscuous'),
PhraseMatch('summary', 'entered'),
])
search_query.add_must_not([
QueryStringMatch('summary: veth*'),
])
self.filtersManual(search_query)
self.searchEventsAggregated('details.hostname', samplesLimit=10)
self.walkAggregations(threshold=1)
def onAggregation(self, aggreg):
category = 'promisc'
severity = 'WARNING'
tags = ['promisc', 'kernel']
summary = 'Promiscuous mode enabled on {1} [{0}]'.format(aggreg['count'], aggreg['value'])
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

42
alerts/proxy_drop.py Normal file
View file

@ -0,0 +1,42 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Jonathan Claudius jclaudius@mozilla.com
# Brandon Myers bmyers@mozilla.com
# Alicia Smith asmith@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, ExistsMatch
class AlertProxyDrop(AlertTask):
def main(self):
search_query = SearchQuery(minutes=5)
search_query.add_must([
TermMatch('category', 'squid'),
ExistsMatch('details.proxyaction'),
])
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
#Set alert properties
def onEvent(self, event):
category = 'squid'
tags = ['squid']
severity = 'WARNING'
url = ""
summary = event['_source']['summary']
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity, url)

View file

@ -0,0 +1,2 @@
[options]
sqs_queues = queue1,queue2

View file

@ -0,0 +1,38 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch
class AlertSQSQueuesDeadman(AlertTask):
def main(self):
self.parse_config('sqs_queues_deadman.conf', ['sqs_queues'])
for queue_name in self.config.sqs_queues.split(','):
self.sqs_queue_name = queue_name
self.process_alert()
def process_alert(self):
search_query = SearchQuery(hours=1)
search_query.add_must(TermMatch('tags', self.sqs_queue_name))
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
def onNoEvent(self):
category = 'deadman'
tags = [self.sqs_queue_name, 'sqs']
severity = 'ERROR'
summary = 'No events found from the {} SQS queue in the last hour'.format(self.sqs_queue_name)
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [], severity=severity)

View file

@ -1,75 +0,0 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Guillaume Destuynder gdestuynder@mozilla.com
# Sample msg
#{
# "_index": "events-20151022",
# "_type": "event",
# "_id": "Fr5Nitk4TXyvwULHF3rRrg",
# "_score": null,
# "_source": {
# "category": "syslog",
# "receivedtimestamp": "2015-10-22T18:29:40.695767+00:00",
# "utctimestamp": "2015-10-22T18:00:47+00:00",
# "tags": [
# "nubis_events_non_prod"
# ],
# "timestamp": "2015-10-22T18:00:47+00:00",
# "mozdefhostname": "mozdefqa1.private.scl3.mozilla.com",
# "summary": "1445536847.673 856 10.162.14.83 TCP_MISS/200 58510 CONNECT www.mozilla.org:443 - HIER_DIRECT/63.245.215.20 -",
# "source": "syslog",
# "details": {
# "__tag": "ec2.forward.squid.access",
# "region": "us-east-1",
# "instance_id": "i-1086c1c4",
# "instance_type": "m3.medium",
# "time": "2015-10-22T18:00:47Z",
# "message": "1445536847.673 856 10.162.14.83 TCP_MISS/200 58510 CONNECT www.mozilla.org:443 - HIER_DIRECT/63.245.215.20 -",
# "az": "us-east-1b"
# }
# },
# "sort": [
# 1445536847000
# ]
from lib.alerttask import AlertTask
import pyes
class AlertHTTPErrors(AlertTask):
def main(self):
# look for events in last 15 mins
date_timedelta = dict(minutes=15)
# Configure filters using pyes
must = [
pyes.TermFilter('tags', 'nubis_events_non_prod'),
pyes.TermFilter('tags', 'nubis_events_prod'),
pyes.TermFilter('category', 'syslog'),
pyes.TermFilter('details.__tag', 'ec2.forward.squid.access'),
pyes.QueryFilter(pyes.MatchQuery('details.summary','is DENIED, because it matched','phrase')),
]
self.filtersManual(date_timedelta, must=must)
# Search events
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'squiderrors'
tags = ['http', 'squid', 'proxy', 'nubis_events']
severity = 'NOTICE'
hostname = event['_source']['hostname']
url = "https://mana.mozilla.org/wiki/display/SECURITY/Notes%3A+Nubis+AWS"
# the summary of the alert is the same as the event
summary = '{0} {1}'.format(hostname, event['_source']['summary'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity=severity, url=url)

View file

@ -0,0 +1,4 @@
[options]
hostfilter = (host1|host2.*).*
users = someuser
ircchannel = #somechannel

View file

@ -0,0 +1,63 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Aaron Meihm <ameihm@mozilla.com>
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, PhraseMatch, QueryStringMatch
import re
class AlertAuthSignRelengSSH(AlertTask):
def main(self):
search_query = SearchQuery(minutes=15)
self.parse_config('ssh_access_signreleng.conf', ['hostfilter', 'users', 'ircchannel'])
if self.config.ircchannel == '':
self.config.ircchannel = None
search_query.add_must([
TermMatch('tags', 'releng'),
TermMatch('details.program', 'sshd'),
QueryStringMatch('details.hostname: /{}/'.format(self.config.hostfilter)),
PhraseMatch('summary', 'Accepted publickey for ')
])
for x in self.config.users.split():
search_query.add_must_not(PhraseMatch('summary', x))
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'access'
tags = ['ssh']
severity = 'NOTICE'
targethost = 'unknown'
sourceipaddress = 'unknown'
x = event['_source']
if 'details' in x:
if 'hostname' in x['details']:
targethost = x['details']['hostname']
if 'sourceipaddress' in x['details']:
sourceipaddress = x['details']['sourceipaddress']
targetuser = 'unknown'
expr = re.compile('Accepted publickey for ([A-Za-z0-9]+) from')
m = expr.match(event['_source']['summary'])
groups = m.groups()
if len(groups) > 0:
targetuser = groups[0]
summary = 'SSH login from {0} on {1} as user {2}'.format(sourceipaddress, targethost, targetuser)
return self.createAlertDict(summary, category, tags, [event], severity, ircchannel=self.config.ircchannel)

27
alerts/ssh_lateral.json Normal file
View file

@ -0,0 +1,27 @@
{
"hostmustmatch": [
".*\\.enterprise\\.mozilla.com"
],
"hostmustnotmatch": [
"ten-forward\\.enterprise\\.mozilla\\.com"
],
"alertifsource": [
"10.0.0.0/8"
],
"notalertifsource": [
"10.0.22.0/24",
"10.0.23.0/24"
],
"ignoreusers": [
".*@\\S+.*"
],
"exceptions": [
["kirk",".*","10.1.1.1/32"],
["kirk",".*","10.1.1.2/32"],
["kirk",".*","10.1.1.3/32"],
["spock","sciencestation.enterprise.mozilla.com","10.0.50.0/24"],
["spock","sciencestation.enterprise.mozilla.com","10.0.51.0/24"],
["spock","sciencestation.enterprise.mozilla.com","10.0.52.0/24"],
["spock","sciencestation.enterprise.mozilla.com","10.0.53.0/24"]
]
}

167
alerts/ssh_lateral.py Normal file
View file

@ -0,0 +1,167 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Aaron Meihm <ameihm@mozilla.com>
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, QueryStringMatch, PhraseMatch
import json
import sys
import re
import netaddr
# This alert requires a configuration file, ssh_lateral.json to exist
# in the alerts directory.
#
# hostmustmatch:
# List of regexes; if one matches the source of the syslog event (e.g. the
# hostname), it will be considered for an alert.
#
# hostmustnotmatch:
# Anything in this list will never be considered for an alert, even if it
# matches hostmustmatch.
#
# alertifsource:
# If a login occurs on a host matching something in hostmustmatch, and it
# is from an address in a CIDR block indicated in alertifsource, an alert
# will be generated.
#
# notalertifsource:
# Any source IP matching a network in this list will not be alerted on,
# even if it matches alertifsource.
#
# ignoreusers:
# Never generate alerts for a login by a user matching these regexes.
#
# exceptions:
# Exceptions to alerts. Each element in the list should be as follows:
# [ "user_regex", "host_regex", "cidr" ]
#
# If a login matches an exception rule, it will not be alerted on.
#
# Example:
#
# {
# "hostmustmatch": [
# ".*\\.enterprise\\.mozilla.com"
# ],
# "hostmustnotmatch": [
# "ten-forward\\.enterprise\\.mozilla\\.com"
# ],
# "alertifsource": [
# "10.0.0.0/8"
# ],
# "notalertifsource": [
# "10.0.22.0/24",
# "10.0.23.0/24"
# ],
# "ignoreusers": [
# ".*@\\S+.*"
# ],
# "exceptions": [
# ["kirk",".*","10.1.1.1/32"],
# ["kirk",".*","10.1.1.2/32"],
# ["kirk",".*","10.1.1.3/32"],
# ["spock","sciencestation.enterprise.mozilla.com","10.0.50.0/24"],
# ["spock","sciencestation.enterprise.mozilla.com","10.0.51.0/24"],
# ["spock","sciencestation.enterprise.mozilla.com","10.0.52.0/24"],
# ["spock","sciencestation.enterprise.mozilla.com","10.0.53.0/24"]
# ]
# }
class SshLateral(AlertTask):
def __init__(self):
AlertTask.__init__(self)
self._config = self.parse_json_alert_config('ssh_lateral.json')
def main(self):
search_query = SearchQuery(minutes=2)
search_query.add_must([
TermMatch('_type', 'event'),
TermMatch('category', 'syslog'),
TermMatch('details.program', 'sshd'),
PhraseMatch('summary', 'Accepted publickey'),
])
self.filtersManual(search_query)
self.searchEventsAggregated('details.hostname', samplesLimit=10)
self.walkAggregations(threshold=1)
# Returns true if the user, host, and source IP fall into an exception
# listed in the configuration file.
def exception_check(self, user, host, srcip):
for x in self._config['exceptions']:
if re.match(x[0], user) != None and \
re.match(x[1], host) != None and \
netaddr.IPAddress(srcip) in netaddr.IPNetwork(x[2]):
return True
return False
def onAggregation(self, aggreg):
category = 'session'
severity = 'WARNING'
tags = ['sshd', 'syslog']
# Determine if this source host is in scope, first match against
# hostmustmatch, and then negate matches using hostmustnotmatch
if len(aggreg['events']) == 0:
return None
srchost = aggreg['events'][0]['_source']['details']['hostname']
srcmatch = False
for x in self._config['hostmustmatch']:
if re.match(x, srchost) != None:
srcmatch = True
break
if not srcmatch:
return None
for x in self._config['hostmustnotmatch']:
if re.match(x, srchost) != None:
return None
# Determine if the origin of the connection was from a source outside
# of the exception policy, and in our address scope
candidates = []
sampleip = None
sampleuser = None
for x in aggreg['events']:
m = re.match('Accepted publickey for (\S+) from (\S+).*', x['_source']['summary'])
if m != None and len(m.groups()) == 2:
ipaddr = netaddr.IPAddress(m.group(2))
for y in self._config['alertifsource']:
if ipaddr in netaddr.IPNetwork(y):
# Validate it's not excepted in the IP negation list
notalertnetwork = False
for z in self._config['notalertifsource']:
if ipaddr in netaddr.IPNetwork(z):
notalertnetwork = True
break
if notalertnetwork:
continue
# Check our user ignore list
skipuser = False
for z in self._config['ignoreusers']:
if re.match(z, m.group(1)):
skipuser = True
break
if skipuser:
continue
# Check our exception list
if self.exception_check(m.group(1), srchost, m.group(2)):
continue
if sampleip == None:
sampleip = m.group(2)
if sampleuser == None:
sampleuser = m.group(1)
candidates.append(x)
if len(candidates) == 0:
return None
summary = 'SSH lateral movement outside policy: access to {} from {} as {}'.format(srchost, sampleip, sampleuser)
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)

View file

@ -0,0 +1,2 @@
[options]
url = https://www.mozilla.org

View file

@ -3,27 +3,31 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Michal Purzynski michal@mozilla.com
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
import pyes
from query_models import SearchQuery, TermMatch, ExistsMatch, PhraseMatch
class AlertSSHManyConns(AlertTask):
def main(self):
# look for events in last 15 mins
date_timedelta = dict(minutes=15)
# Configure filters using pyes
must = [
pyes.TermFilter('_type', 'bro'),
pyes.TermFilter('eventsource', 'nsm'),
pyes.TermFilter('category', 'bronotice'),
pyes.ExistsFilter('details.sourceipaddress'),
pyes.QueryFilter(pyes.MatchQuery('details.note','SSH::Password_Guessing','phrase')),
]
self.filtersManual(date_timedelta, must=must)
self.parse_config('sshbruteforce_bro.conf', ['url'])
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('_type', 'bro'),
TermMatch('eventsource', 'nsm'),
TermMatch('category', 'bronotice'),
ExistsMatch('details.sourceipaddress'),
PhraseMatch('details.note', 'SSH::Password_Guessing'),
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()
@ -35,11 +39,10 @@ class AlertSSHManyConns(AlertTask):
tags = ['http']
severity = 'NOTICE'
hostname = event['_source']['hostname']
url = "https://mana.mozilla.org/wiki/display/SECURITY/NSM+IR+procedures"
url = self.config.url
# the summary of the alert is the same as the event
summary = '{0} {1}'.format(hostname, event['_source']['summary'])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, [event], severity=severity, url=url)

36
alerts/sshioc.py Normal file

@ -0,0 +1,36 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2015 Mozilla Corporation
#
# Contributors:
# Aaron Meihm <ameihm@mozilla.com>
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch
class AlertSSHIOC(AlertTask):
def main(self):
search_query = SearchQuery(minutes=30)
search_query.add_must([
TermMatch('_type', 'event'),
TermMatch('tags', 'mig-runner-sshioc'),
])
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'sshioc'
tags = ['sshioc']
severity = 'WARNING'
summary = 'SSH IOC match from runner plugin'
return self.createAlertDict(summary, category, tags, [event], severity)


@ -9,21 +9,22 @@
# Michal Purzynski michal@mozilla.com
from lib.alerttask import AlertTask
import pyes
from query_models import SearchQuery, TermMatch, ExistsMatch
class AlertSSLBlacklistHit(AlertTask):
def main(self):
# look for events in last 15 mins
date_timedelta = dict(minutes=15)
# Configure filters using pyes
must = [
pyes.TermFilter('_type', 'bro'),
pyes.TermFilter('eventsource', 'nsm'),
pyes.TermFilter('category', 'brointel'),
pyes.TermFilter('details.sources', 'abuse.ch SSLBL'),
pyes.ExistsFilter('details.sourceipaddress')
]
self.filtersManual(date_timedelta, must=must)
search_query = SearchQuery(minutes=15)
search_query.add_must([
TermMatch('_type', 'bro'),
TermMatch('eventsource', 'nsm'),
TermMatch('category', 'brointel'),
TermMatch('details.sources', 'abuse.ch SSLBL'),
ExistsMatch('details.sourceipaddress')
])
self.filtersManual(search_query)
# Search events
self.searchEventsSimple()


@ -1,25 +0,0 @@
[supervisord]
#Set true for debug
nodaemon=false
autostart=true
autorestart=true
logfile=/home/mozdef/envs/mozdef/logs/supervisord.log
childlogdir=/home/mozdef/envs/mozdef/logs/
pidfile=/home/mozdef/envs/mozdef/alerts/plugins/supervisord.pid
user=mozdef
[rpcinterface:supervisor]
supervisor.rpcinterface_factory = supervisor.rpcinterface:make_main_rpcinterface
[unix_http_server]
file=/home/mozdef/envs/mozdef/supervisorctl.sock
[supervisorctl]
serverurl=unix:///home/mozdef/envs/mozdef/supervisorctl.sock
[program:alerts]
priority=2
command=celery -A celeryconfig worker --loglevel=info --beat
user=mozdef
group=mozdef
directory=/home/mozdef/envs/mozdef/alerts


@ -0,0 +1,28 @@
[supervisord]
#Set true for debug
nodaemon=false
autostart=true
autorestart=true
logfile=/var/log/mozdef/supervisord/supervisord.log
pidfile=/var/run/mozdef-alerts/supervisord.pid
user=mozdef
[rpcinterface:supervisor]
supervisor.rpcinterface_factory = supervisor.rpcinterface:make_main_rpcinterface
[unix_http_server]
file=/opt/mozdef/envs/mozdef/alerts/supervisorctl.sock
[supervisorctl]
serverurl=unix:///opt/mozdef/envs/mozdef/alerts/supervisorctl.sock
[program:alerts]
priority=2
command=celery -A celeryconfig worker --loglevel=info --beat
user=mozdef
group=mozdef
directory=/opt/mozdef/envs/mozdef/alerts
stdout_logfile=/var/log/mozdef/supervisord/alert_output.log
stdout_logfile_maxbytes=50MB
stderr_logfile=/var/log/mozdef/supervisord/alert_errors.log
stderr_logfile_maxbytes=50MB


@ -0,0 +1,2 @@
[options]
url = https://www.mozilla.org

57
alerts/unauth_portscan.py Normal file

@ -0,0 +1,57 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
# Aaron Meihm ameihm@mozilla.com
# Michal Purzynski <mpurzynski@mozilla.com>
# Alicia Smith <asmith@mozilla.com>
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, ExistsMatch, PhraseMatch
class AlertUnauthPortScan(AlertTask):
def main(self):
self.parse_config('unauth_portscan.conf', ['url'])
search_query = SearchQuery(minutes=30)
search_query.add_must([
TermMatch('_type', 'bro'),
TermMatch('category', 'bronotice'),
TermMatch('eventsource', 'nsm'),
ExistsMatch('details.indicators'),
PhraseMatch('details.note', 'Scan::Port_Scan'),
])
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'scan'
severity = 'NOTICE'
hostname = event['_source']['hostname']
url = self.config.url
indicators = 'unknown'
target = 'unknown'
x = event['_source']
if 'details' in x:
if 'indicators' in x['details']:
indicators = x['details']['indicators']
if 'destinationipaddress' in x['details']:
target = x['details']['destinationipaddress']
summary = '{2}: Unauthorized Port Scan Event from {0} scanning ports on host {1}'.format(indicators, target, hostname)
# Create the alert object based on these properties
return self.createAlertDict(summary, category, [], [event], severity, url)
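
The nested 'in' checks in onEvent() above are one way to pull optional fields out of event['_source']; here is a minimal sketch (not part of this changeset) of the same extraction using dict.get() with defaults, on an assumed example event.

event_source = {'details': {'indicators': '192.0.2.10'}}   # assumed example _source
details = event_source.get('details', {})
indicators = details.get('indicators', 'unknown')
target = details.get('destinationipaddress', 'unknown')
print(indicators, target)   # 192.0.2.10 unknown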

3
alerts/unauth_scan.conf Normal file

@ -0,0 +1,3 @@
[options]
url = https://www.mozilla.org
nsm_host = nsmservername

58
alerts/unauth_scan.py Normal file

@ -0,0 +1,58 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2017 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
# Aaron Meihm ameihm@mozilla.com
# Michal Purzynski <mpurzynski@mozilla.com>
# Alicia Smith <asmith@mozilla.com>
# Brandon Myers bmyers@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, ExistsMatch, PhraseMatch
class AlertUnauthInternalScan(AlertTask):
def main(self):
self.parse_config('unauth_scan.conf', ['nsm_host', 'url'])
search_query = SearchQuery(minutes=2)
search_query.add_must([
TermMatch('_type', 'bro'),
TermMatch('category', 'bronotice'),
TermMatch('eventsource', 'nsm'),
TermMatch('hostname', self.config.nsm_host),
ExistsMatch('details.indicators'),
PhraseMatch('details.note', 'Scan::Address_Scan'),
])
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
# Set alert properties
def onEvent(self, event):
category = 'scan'
severity = 'NOTICE'
hostname = event['_source']['hostname']
url = self.config.url
indicators = 'unknown'
port = 'unknown'
x = event['_source']
if 'details' in x:
if 'indicators' in x['details']:
indicators = x['details']['indicators']
if 'p' in x['details']:
port = x['details']['p']
summary = '{2}: Unauthorized Internal Scan Event from {0} scanning ports {1}'.format(indicators, port, hostname)
# Create the alert object based on these properties
return self.createAlertDict(summary, category, [], [event], severity, url)

4
alerts/unauth_ssh.conf Normal file

@ -0,0 +1,4 @@
[options]
hostfilter = (host1|host2.*).*
users = someuser
skiphosts = somehost anotherhost


@ -9,39 +9,40 @@
# Aaron Meihm <ameihm@mozilla.com>
from lib.alerttask import AlertTask
import pyes
import json
from query_models import SearchQuery, TermMatch, QueryStringMatch, PhraseMatch
import re
from configlib import getConfig, OptionParser
# Note: this plugin requires a configuration file (unauth_ssh_pyes.conf)
# Note: this plugin requires a configuration file (unauth_ssh.conf)
# to exist in the same directory as the plugin.
#
# It should contain content such as:
# [options]
# hostfilter <ES compatible regexp>
# user username
# skiphosts 1.2.3.4 2.3.4.5
# skiphosts 1.2.3.4 2.3.4.5
class AlertUnauthSSH(AlertTask):
def main(self):
date_timedelta = dict(minutes=30)
self.config_file = './unauth_ssh_pyes.conf'
self.config_file = './unauth_ssh.conf'
self.config = None
self.initConfiguration()
must = [
pyes.TermFilter('_type', 'event'),
pyes.TermFilter('category', 'syslog'),
pyes.TermFilter('details.program', 'sshd'),
pyes.QueryFilter(pyes.QueryStringQuery('details.hostname: /{}/'.format(self.config.hostfilter))),
pyes.QueryFilter(pyes.MatchQuery('summary', 'Accepted publickey {}'.format(self.config.user), operator='and'))
]
must_not = []
search_query = SearchQuery(minutes=30)
search_query.add_must([
TermMatch('_type', 'event'),
TermMatch('category', 'syslog'),
TermMatch('details.program', 'sshd'),
QueryStringMatch('details.hostname: /{}/'.format(self.config.hostfilter)),
PhraseMatch('summary', 'Accepted publickey for {}'.format(self.config.user))
])
for x in self.config.skiphosts:
must_not.append(pyes.QueryFilter(pyes.MatchQuery('summary', x)))
self.filtersManual(date_timedelta, must=must, must_not=must_not)
search_query.add_must_not(PhraseMatch('summary', x))
self.filtersManual(search_query)
self.searchEventsSimple()
self.walkEvents()
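
The comment block above documents the expected shape of unauth_ssh.conf. As a minimal sketch (not part of this changeset), the same [options] section can be read with Python's standard library config parser; MozDef itself loads it through configlib, so the snippet below is only illustrative.

try:
    import configparser                      # Python 3
except ImportError:
    import ConfigParser as configparser      # Python 2
cfg = configparser.RawConfigParser()
cfg.read('unauth_ssh.conf')
hostfilter = cfg.get('options', 'hostfilter')          # ES-compatible regexp
skiphosts = cfg.get('options', 'skiphosts').split()    # whitespace-separated list
print(hostfilter, skiphosts)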


@ -1,4 +0,0 @@
[options]
hostfilter = regexp
user = username
skiphosts = 127.0.0.1 127.0.0.2


@ -1,21 +0,0 @@
[uwsgi]
chdir = /home/mozdef/envs/mozdef/alerts
uid = mozdef
mule = alertWorker.py
mule = alertWorker.py
mule = alertWorker.py
mule = alertWorker.py
pyargv = -c /home/mozdef/envs/mozdef/alerts/alertWorker.conf
py-auto-reload=30s
;stats = 127.0.0.1:9192
;py-auto-reload=30s
daemonize = /home/mozdef/envs/mozdef/logs/uwsgi.AlertPluginsMules.log
;ignore normal operations that generate nothing but normal response
log-drain = generated 0 bytes
log-date = %%a %%b %%d %%H:%%M:%%S
socket = /home/mozdef/envs/mozdef/alerts/AlertPluginsMules.socket
virtualenv = /home/mozdef/envs/mozdef/
master-fifo = /home/mozdef/envs/mozdef/alerts/AlertPluginsMules.fifo
never-swap
pidfile= /home/mozdef/envs/mozdef/alerts/AlertPluginsMules.pid
vacuum = true


@ -0,0 +1,50 @@
#!/usr/bin/env python
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
# Copyright (c) 2014 Mozilla Corporation
#
# Contributors:
# Anthony Verez averez@mozilla.com
# Jeff Bryner jbryner@mozilla.com
from lib.alerttask import AlertTask
from query_models import SearchQuery, TermMatch, PhraseMatch
class AlertManyVPNDuoAuthFailures(AlertTask):
def main(self):
search_query = SearchQuery(minutes=2)
search_query.add_must([
TermMatch('_type', 'event'),
TermMatch('category', 'event'),
TermMatch('tags', 'duosecurity'),
PhraseMatch('details.integration', 'global and external openvpn'),
PhraseMatch('details.result', 'FAILURE'),
])
self.filtersManual(search_query)
# Search aggregations on field 'username', keep X samples of events at most
self.searchEventsAggregated('details.username', samplesLimit=5)
# alert when >= X matching events in an aggregation
self.walkAggregations(threshold=5)
# Set alert properties
def onAggregation(self, aggreg):
# aggreg['count']: number of items in the aggregation, ex: number of failed login attempts
# aggreg['value']: value of the aggregation field, ex: toto@example.com
# aggreg['events']: list of events in the aggregation
category = 'openvpn'
tags = ['vpn', 'auth', 'duo']
severity = 'NOTICE'
summary = ('{0} openvpn authentication attempts by {1}'.format(aggreg['count'], aggreg['value']))
sourceip = self.mostCommon(aggreg['allevents'], '_source.details.ip')
for i in sourceip[:5]:
summary += ' {0} ({1} hits)'.format(i[0], i[1])
# Create the alert object based on these properties
return self.createAlertDict(summary, category, tags, aggreg['events'], severity)
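
The onAggregation() comments above spell out the aggreg dict (count, value, events, plus allevents consumed by mostCommon()). A minimal sketch (not part of this changeset) of the same top-source tally using collections.Counter over an assumed example aggregation:

from collections import Counter

aggreg = {                                       # assumed example aggregation
    'count': 3,
    'value': 'toto@example.com',
    'allevents': [
        {'_source': {'details': {'ip': '198.51.100.7'}}},
        {'_source': {'details': {'ip': '198.51.100.7'}}},
        {'_source': {'details': {'ip': '203.0.113.9'}}},
    ],
}
sourceip = Counter(e['_source']['details']['ip'] for e in aggreg['allevents']).most_common()
summary = '{0} openvpn authentication attempts by {1}'.format(aggreg['count'], aggreg['value'])
for ip, hits in sourceip[:5]:
    summary += ' {0} ({1} hits)'.format(ip, hits)
print(summary)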


@ -78,7 +78,7 @@ if __name__ == '__main__':
for i in range(0,10):
print(i)
alog=dict(eventtime=pytz.timezone('US/Pacific').localize(datetime.now()).isoformat(),\
alog=dict(eventtime=pytz.timezone('UTC').localize(datetime.now()).isoformat(),\
hostname=socket.gethostname(),\
processid=os.getpid(),\
processname=sys.argv[0],\


@ -1 +0,0 @@
replace me with valid GeoLiteCity.dat file

27
bot/README.md Normal file

@ -0,0 +1,27 @@
KitnIRC - A Python IRC Bot Framework
====================================
KitnIRC is an IRC framework that attempts to handle most of the
monotony of writing IRC bots without sacrificing flexibility.
Usage
-----
See the `skeleton` directory in the root level for a starting code skeleton
you can copy into a new project's directory and build off of, and
[Getting Started](https://github.com/ayust/kitnirc/wiki/Getting-Started)
for introductory documentation.
License
-------
KitnIRC is licensed under the MIT License (see `LICENSE` for details).
Other Resources
---------------
Useful reference documents for those working with the IRC protocol as a client:
* [RFC 2812](http://tools.ietf.org/html/rfc2812)
* [ISUPPORT draft](http://tools.ietf.org/html/draft-brocklesby-irc-isupport-03)
* [List of numeric replies](https://www.alien.net.au/irc/irc2numerics.html)

@ -1 +0,0 @@
Subproject commit d8bf81a2ae658ab23d35493e346f0b27eb889f71


@ -71,7 +71,8 @@ class Zilla(Module):
except:
return
for bug in res['bugs']:
self.controller.client.msg(self.channel, "\x037\x02WARNING\x03\x02 \x032\x02NEW\x03\x02 bug: {url}{bugid} {summary}".format(summary=bug['summary'],
bugsummary = bug['summary'].encode('utf-8', 'replace')
self.controller.client.msg(self.channel, "\x037\x02WARNING\x03\x02 \x032\x02NEW\x03\x02 bug: {url}{bugid} {summary}".format(summary=bugsummary,
url=self.url, bugid=bug['id']))
def start(self, *args, **kwargs):

23
bot/mozdefbot.conf Normal file

@ -0,0 +1,23 @@
[options]
host = <add_irc_hostname>
password = <add_irc_password>
nick = mozdef
username = mozdef
realname = mozdef
join = #somechannel
mqalertserver = localhost
mquser=mozdef
mqpassword=mozdef
channelkeys={"#somechannel": "somepassword"}
[modules]
modules.roulette = 5
modules.zilla = 5
[zilla]
url = https://bugzilla.mozilla.org/
api_key = <add_api_key>
interval= 300
channel = #somechannel
search_terms = [{"product": "<add_product_name>"}, {"status": "NEW"}, {"status": "UNCONFIRMED"}]

16
bot/mozdefbot.ini Normal file

@ -0,0 +1,16 @@
[uwsgi]
chdir = /opt/mozdef/envs/mozdef/bot/
uid = mozdef
mule = mozdefbot.py
pyargv = -c /opt/mozdef/envs/mozdef/bot/mozdefbot.conf
log-syslog = mozdefbot-worker
log-drain = generated 0 bytes
socket = /opt/mozdef/envs/mozdef/bot/mozdefbot.socket
virtualenv = /opt/mozdef/envs/mozdef/
procname-master = [m]
procname-prefix = [mozdefbot]
master-fifo = /opt/mozdef/envs/mozdef/bot/mozdefbot.fifo
never-swap
pidfile = /var/run/mozdefbot/mozdefbot.pid
vacuum = true
enable-threads


@ -28,6 +28,11 @@ from dateutil.parser import parse
from kombu import Connection, Queue, Exchange
from kombu.mixins import ConsumerMixin
import sys
import os
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), '../lib'))
from utilities.toUTC import toUTC
logger = logging.getLogger()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
@ -104,28 +109,6 @@ def run_async(func):
return async_func
def toUTC(suspectedDate, localTimeZone=None):
'''make a UTC date out of almost anything'''
utc = pytz.UTC
objDate = None
if localTimeZone is None:
localTimeZone = options.defaultTimeZone
if type(suspectedDate) == str:
objDate = parse(suspectedDate, fuzzy=True)
elif type(suspectedDate) == datetime:
objDate = suspectedDate
if objDate.tzinfo is None:
objDate = pytz.timezone(localTimeZone).localize(objDate)
objDate = utc.normalize(objDate)
else:
objDate = utc.normalize(objDate)
if objDate is not None:
objDate = utc.normalize(objDate)
return objDate
def getQuote():
aquote = '{0} --Mos Def'.format(
quotes[random.randint(0, len(quotes) - 1)].strip())
@ -143,7 +126,8 @@ def isIP(ip):
def ipLocation(ip):
location = ""
try:
gi = pygeoip.GeoIP('GeoLiteCity.dat', pygeoip.MEMORY_CACHE)
geoip_location = os.path.join(os.path.dirname(os.path.abspath(__file__)), "../lib/GeoLiteCity.dat")
gi = pygeoip.GeoIP(geoip_location, pygeoip.MEMORY_CACHE)
geoDict = gi.record_by_addr(str(netaddr.IPNetwork(ip)[0]))
if geoDict is not None:
location = geoDict['country_name']
@ -273,6 +257,7 @@ class mozdefBot():
pass
except Exception as e:
sys.stdout.write('stdout - bot error, quitting {0}'.format(e))
self.client.root_logger.error('bot error..quitting {0}'.format(e))
self.client.disconnect()
if self.mqConsumer:
@ -323,6 +308,12 @@ class alertConsumer(ConsumerMixin):
logger.exception(
"alertworker exception: unknown body type received %r" % body)
return
if 'notify_mozdefbot' in bodyDict and bodyDict['notify_mozdefbot'] is False:
# If the alert tells us to not notify, then don't post to IRC
message.ack()
return
# process valid message
# see where we send this alert
ircchannel = options.alertircchannel
@ -342,6 +333,7 @@ class alertConsumer(ConsumerMixin):
if len(bodyDict['summary']) > 450:
sys.stdout.write('alert is more than 450 bytes, truncating\n')
bodyDict['summary'] = bodyDict['summary'][:450] + ' truncated...'
self.ircBot.client.msg(ircchannel, formatAlert(bodyDict))
message.ack()
@ -382,13 +374,7 @@ def consumeAlerts(ircBot):
def initConfig():
# initialize config options
# sets defaults or overrides from config file.
# change this to your default zone for when it's not specified
# in time strings
options.defaultTimeZone = getConfig('defaulttimezone',
'US/Pacific',
options.configfile)
# irc options
options.host = getConfig('host', 'irc.somewhere.com', options.configfile)
options.nick = getConfig('nick', 'mozdefnick', options.configfile)
@ -447,7 +433,7 @@ if __name__ == "__main__":
sh = logging.StreamHandler(sys.stderr)
sh.setFormatter(formatter)
logger.addHandler(sh)
parser = OptionParser()
parser.add_option(
"-c", dest='configfile',


@ -1,8 +0,0 @@
[options]
host = irc.hostname.org
password = botpasswordgoeshere
nick = botnickgoeshere
join = #channelnamegoeshere
mqalertserver = messagequeueservernamegoeshere
alertexchange = alerts
channelkeys={"#somechannel": "somekey"}


@ -0,0 +1,24 @@
if $programname == 'mozdefbot-worker' then /var/log/mozdef/mozdefbot.log
if $programname == 'loginput-worker' then /var/log/mozdef/loginput.log
if $programname == 'infosecsqs-worker' then /var/log/mozdef/infosecsqs.log
if $programname == 'restapi-worker' then /var/log/mozdef/restapi.log
if $programname == 'syslog-worker' then /var/log/mozdef/syslog.log
if $programname == 'nubis-worker' then /var/log/mozdef/nubis.log
if $programname == 'bro-worker' then /var/log/mozdef/bro.log
if $programname == 'migsqs-worker' then /var/log/mozdef/migsqs.log
if $programname == 'parsyssqs-worker' then /var/log/mozdef/parsyssqs.log
if $programname == 'autoland-worker' then /var/log/mozdef/autoland.log
if $programname == 'contegix-worker' then /var/log/mozdef/contegix.log
if $programname == 'deis-worker' then /var/log/mozdef/deis.log
if $programname == 'releng-worker' then /var/log/mozdef/releng.log
if $programname == 'fxa-worker' then /var/log/mozdef/fxa.log
if $programname == 'httpobs-worker' then /var/log/mozdef/httpobs.log
if $programname == 'riskheatmap-worker' then /var/log/mozdef/riskheatmap.log
if $programname == 'ssosqs-worker' then /var/log/mozdef/sso.log
if $programname == 'cloudtrail-worker' then /var/log/mozdef/cloudtrail.log
if $programname == 'alertplugins-worker' then /var/log/mozdef/alertplugins.log
if $programname == 'contegix-auditd-worker' then /var/log/mozdef/contegix-auditd.log
if $programname == 'mongod.3002' then /var/log/mozdef/mongo/meteor-mongo.log
if $programname == 'mongod' then /var/log/mozdef/mongo/mongo.log
if $programname == 'kibana4' then /var/log/mozdef/kibana.log
& stop

13
config/logrotate-mongod Normal file

@ -0,0 +1,13 @@
/var/log/mozdef/mongo/*.log
{
rotate 17
weekly
missingok
notifempty
compress
delaycompress
sharedscripts
postrotate
service rsyslog reload-or-restart > /dev/null
endscript
}

13
config/logrotate-mozdef Normal file

@ -0,0 +1,13 @@
/var/log/mozdef/*.log
{
rotate 17
weekly
missingok
notifempty
compress
delaycompress
sharedscripts
postrotate
service rsyslog reload-or-restart > /dev/null
endscript
}

15
config/logrotate-nginx Normal file

@ -0,0 +1,15 @@
/var/log/mozdef/nginx/*.log {
weekly
missingok
rotate 17
compress
delaycompress
notifempty
create 644 mozdef mozdef
sharedscripts
postrotate
if [ -f /var/run/nginx.pid ]; then
kill -USR1 `cat /var/run/nginx.pid`
fi
endscript
}


@ -0,0 +1,11 @@
/var/log/mozdef/supervisord/*.log
{
copytruncate
dateext
rotate 17
weekly
missingok
notifempty
compress
delaycompress
}


@ -4,10 +4,10 @@
# http://docs.mongodb.org/manual/reference/configuration-options/
# where to write logging data.
systemLog:
destination: file
logAppend: true
path: /opt/mozdef/envs/mozdef/logs/meteor-mongo.log
#systemLog:
# destination: file
# logAppend: true
# path: /var/log/mozdef/mongo/meteor-mongo.log
# Where and how to store data.
storage:

169
config/nginx.conf Normal file

@ -0,0 +1,169 @@
user mozdef mozdef;
worker_processes 5;
error_log /var/log/mozdef/nginx/error_log notice;
events {
worker_connections 1024;
use epoll;
}
http {
include /etc/nginx/mime.types;
default_type application/octet-stream;
log_format main
'$remote_addr - $remote_user [$time_local] '
'"$request" $status $bytes_sent '
'"$http_referer" "$http_user_agent" '
'"$gzip_ratio"';
client_header_timeout 10m;
client_max_body_size 4m;
client_body_timeout 10m;
send_timeout 10m;
connection_pool_size 256;
client_header_buffer_size 1k;
client_body_buffer_size 1024k;
large_client_header_buffers 4 2k;
request_pool_size 4k;
gzip on;
gzip_min_length 1100;
gzip_buffers 4 8k;
gzip_types text/plain;
output_buffers 1 32k;
postpone_output 1460;
sendfile on;
tcp_nopush on;
tcp_nodelay on;
keepalive_disable none;
keepalive_requests 2147483647;
keepalive_timeout 750s;
ignore_invalid_headers on;
index index.html;
ssl_session_cache shared:SSL:10m;
ssl_session_timeout 10m;
ssl_certificate /etc/ssl/certs/server.crt;
ssl_certificate_key /etc/ssl/certs/server.key;
proxy_temp_path /tmp/proxy 1 2;
server {
access_log /dev/null main;
error_log /var/log/mozdef/nginx/nginx.error_log notice;
listen 8080;
listen 8443 ssl;
charset utf-8;
root /opt/mozdef/envs/mozdef/loginput;
location / {
include /opt/mozdef/envs/mozdef/bin/uwsgi_params;
uwsgi_pass unix:/opt/mozdef/envs/mozdef/loginput/loginput.socket;
uwsgi_param UWSGI_PYHOME /opt/mozdef/envs/mozdef/;
uwsgi_param UWSGI_CHIDIR /opt/mozdef/envs/mozdef/loginput;
uwsgi_param UWSGI_SCRIPT index; # this should be the .py file name without suffix that your bottle will use to launch
}
location ~static{
root /opt/mozdef/envs/mozdef;
}
}
## rest service on 8444 ssl ###
server {
access_log /dev/null main;
error_log /var/log/mozdef/nginx/nginx.rest.error_log notice;
listen 8444 ssl;
charset utf-8;
root /opt/mozdef/envs/mozdef/rest;
location / {
include /opt/mozdef/envs/mozdef/bin/uwsgi_params;
uwsgi_pass unix:/opt/mozdef/envs/mozdef/rest/restapi.socket;
uwsgi_param UWSGI_PYHOME /opt/mozdef/envs/mozdef/;
uwsgi_param UWSGI_CHIDIR /opt/mozdef/envs/mozdef/rest;
uwsgi_param UWSGI_SCRIPT index;
}
}
##ssl version of elastic search##
server{
listen *:9000 ssl;
access_log /dev/null main;
location /{
proxy_pass http://127.0.0.1:9200;
proxy_read_timeout 90;
}
}
##ssl proxy for meteor##
server{
listen *:443 ssl;
access_log /dev/null main;
location /{
proxy_pass http://127.0.0.1:3000;
proxy_read_timeout 90;
}
}
##kibana 9090, plain text, 9443 ssl##
server {
listen *:9090 ;
listen *:9443 ssl;
server_name localhost;
access_log /dev/null main;
error_log /var/log/mozdef/nginx/nginx.kibana.error_log notice;
location ~ (/app/kibana|/app/sense|/bundles/|/kibana4|/status|/plugins|/elasticsearch|/api/sense/proxy|/kibana|/kibana/goto) {
proxy_http_version 1.1;
proxy_pass http://127.0.0.1:5601;
proxy_read_timeout 90;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $http_host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_cache_bypass $http_upgrade;
rewrite /kibana/(.*)$ /$1 break;
proxy_redirect http://127.0.0.1:5601 https://localhost:9443;
}
location ~ ^/_aliases$ {
proxy_pass http://127.0.0.1:9200;
proxy_read_timeout 90;
}
location ~ ^/_nodes$ {
proxy_pass http://127.0.0.1:9200;
proxy_read_timeout 90;
}
location ~ ^/.*/_search$ {
proxy_pass http://127.0.0.1:9200;
proxy_read_timeout 90;
}
location ~ ^/.*/_mapping$ {
proxy_pass http://127.0.0.1:9200;
proxy_read_timeout 90;
}
# Password protected end points
location ~ ^/.kibana/dashboard/.*$ {
proxy_pass http://127.0.0.1:9200;
proxy_read_timeout 90;
}
location ~ ^/.kibana/temp.*$ {
proxy_pass http://127.0.0.1:9200;
proxy_read_timeout 90;
}
}
}


@ -50,4 +50,4 @@
}
}
}
}
}

Some files were not shown because too many files changed in this diff.