test(spec): Control Server API secured via cross-pinning mTLS

. We are now leveraging Nix for portable dependency handling.
. There are now three types of tests: client, server, and end-to-end.
. Server tests exercise the actual kapow server being tested, while the
requests are performed by the test steps.
. Client tests exercise the actual kapow client being tested, while the
requests are served by the test steps.
. E2e tests exercise the actual kapow program in its dual role of client
and server (as it should be!).

Co-authored-by: Roberto Abdelkader Martínez Pérez <robertomartinezp@gmail.com>
pancho horrillo
2021-03-12 17:02:42 +01:00
parent b7b55d2f3b
commit ab50721f69
17 changed files with 1587 additions and 92 deletions
+1 -1
@@ -40,7 +40,7 @@ jobs:
           docker build . -t bbvalabsci/kapow-spec-test-suite:latest
       - name: Spec test
         run: |
-          docker run --mount type=bind,source=$(pwd)/build/kapow,target=/usr/local/bin/kapow bbvalabsci/kapow-spec-test-suite:latest behave --tags=~@skip
+          docker run --mount type=bind,source=$(pwd)/build/kapow,target=/usr/bin/kapow bbvalabsci/kapow-spec-test-suite:latest "behave --tags=~@skip"
   doc-test:
     runs-on: ubuntu-20.04
     steps:
+4
@@ -6,3 +6,7 @@ build
 docs/build
 docs/Pipfile.lock
+node_modules
+*.swp
+2 -2
@@ -42,8 +42,8 @@ coverage: test race
 install: build
 	CGO_ENABLED=0 $(GOINSTALL) ./...
-acceptance: install
-	make -C ./spec/test
+acceptance: build
+	cd ./spec/test && PATH=$(PWD)/build:$$PATH nix-shell --command make
 deps:
 	@echo "deps here"
+32 -6
@@ -130,6 +130,7 @@ whole lifetime of the server.
 * Kapow! implementations should follow a general principle of robustness: be
   conservative in what you do, be liberal in what you accept from others.
 * We reuse conventions of well-established software projects, such as Docker.
+* Secure by default, the Control API can *only* be accessed using mTLS.
 * All requests and responses will leverage JSON as the data encoding method.
 * The API calls responses have several parts:
   * The HTTP status code (e.g., `400`, which is a bad request). The target
@@ -178,6 +179,30 @@ Content-Length: 25
 ```
+## mTLS
+
+The Kapow! server generates a pair of keys and certificates, one for the
+server, the other for the configuring client. The necessary elements will be
+communicated to the client (the init program) via a set of environment
+variables.
+
+The aforementioned variables are named:
+
+- `KAPOW_CONTROL_SERVER_CERT`: server certificate.
+- `KAPOW_CONTROL_CLIENT_CERT`: client certificate.
+- `KAPOW_CONTROL_CLIENT_KEY`: client private key.
+
+Note that all variables contain x509 PEM-encoded values.
+
+Also note that the server private key is not communicated in any way.
+
+Following the mTLS discipline, the client must ensure upon connecting to the
+server that its certificate matches the one stored in
+`KAPOW_CONTROL_SERVER_CERT`.
+
+Conversely, the server must only communicate with clients whose certificate
+matches the one stored in `KAPOW_CONTROL_CLIENT_CERT`.
 ## API Elements
 Kapow! provides a way to control its internal state through these elements.
@@ -606,8 +631,6 @@ Commands:
 ```
 ### `kapow server`
 This command runs the Kapow! server, which is the core of Kapow!. If
 run without parameters, it will run an unconfigured server. It can accept a path
 to an executable file, the init program, which can be a shell script that
@@ -615,7 +638,7 @@ contains commands to configure the *Kapow!* server.
 The init program can leverage the `kapow route` command, which is used to define
 a route. The `kapow route` command needs a way to reach the *Kapow!* server,
-and for that, `kapow` provides the `KAPOW_DATA_URL` variable in the environment
+and for that, `kapow` provides the `KAPOW_CONTROL_URL` variable in the environment
 of the aforementioned init program.
 Every time the *Kapow!* server receives a request, it will spawn a process to
@@ -655,7 +678,10 @@ To deregister a route you must provide a *route_id*.
 #### **Environment**
-- `KAPOW_DATA_URL`
+- `KAPOW_CONTROL_URL`
+- `KAPOW_CONTROL_SERVER_CERT`
+- `KAPOW_CONTROL_CLIENT_CERT`
+- `KAPOW_CONTROL_CLIENT_KEY`
 #### **Help**
@@ -696,7 +722,7 @@ Options:
 $ kapow route add -X GET '/list/{ip}' -c 'nmap -sL $(kapow get /request/matches/ip) | kapow set /response/body'
 ```
-### `request`
+### `kapow get`
 Exposes the requests' resources.
@@ -713,7 +739,7 @@ $ kapow get /request/body
 ```
-### `response`
+### `kapow set`
 Exposes the response's resources.
+1
@@ -0,0 +1 @@
+use_nix
+10 -8
@@ -1,16 +1,18 @@
-FROM python:3.7-alpine
+FROM nixos/nix:2.3.6
 # Install CircleCI requirements for base images
 # https://circleci.com/docs/2.0/custom-images/
-RUN apk upgrade --update-cache \
-    && apk add git openssh-server tar gzip ca-certificates
+# RUN apk upgrade --update-cache \
+#     && apk add git openssh-server tar gzip ca-certificates
 # Install Kapow! Spec Test Suite
 RUN mkdir -p /usr/src/ksts
 WORKDIR /usr/src/ksts
 COPY features /usr/src/ksts/features
-COPY Pipfile Pipfile.lock /usr/src/ksts/
-RUN pip install --upgrade pip \
-    && pip install pipenv \
-    && pipenv install --deploy --system \
-    && rm -f Pipfile Pipfile.lock
+# COPY Pipfile Pipfile.lock /usr/src/ksts/
+# RUN pip install --upgrade pip \
+#     && pip install pipenv \
+#     && pipenv install --deploy --system \
+#     && rm -f Pipfile Pipfile.lock
+COPY ./*.nix ./
+ENTRYPOINT [ "nix-shell", "--command" ]
+10 -13
@@ -1,23 +1,20 @@
-.PHONY: lint wip test fix catalog sync
-all: checkbin sync test
+.PHONY: all lint wip test fix catalog
+all: checkbin test
-sync:
-	pipenv sync
 lint:
 	gherkin-lint
 wip:
-	KAPOW_DEBUG_TESTS=1 pipenv run behave --stop --wip
+	KAPOW_DEBUG_TESTS=1 behave --stop --wip -k
 test: lint
-	pipenv run behave --no-capture --tags=~@skip
+	behave --no-capture --tags=~@skip
 fix: lint
-	KAPOW_DEBUG_TESTS=1 pipenv run behave --stop --no-capture --tags=~@skip
+	KAPOW_DEBUG_TESTS=1 behave --stop --no-capture --tags=~@skip
 catalog:
-	pipenv run behave --format steps.usage --dry-run --no-summary -q
+	behave --format steps.usage --dry-run --no-summary -q
-clean:
-	pipenv --rm
 checkbin:
 	@which kapow >/dev/null || (echo "ERROR: Your kapow binary is not present in PATH" && exit 1)
-testpoc: sync
-	pipenv run pip install -r ../../testutils/poc/requirements.txt
-	PATH=../../testutils/poc:$$PATH KAPOW_CONTROL_URL=http://localhost:8081 KAPOW_DATA_URL=http://localhost:8081 pipenv run behave --no-capture --tags=~@skip
+testpoc:
+	PATH=../../testutils/poc:$$PATH behave --no-capture --tags=~@skip
+wippoc:
+	PATH=../../testutils/poc:$$PATH behave --no-capture --tags=@wip -k
@@ -13,6 +13,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
+@server
 Feature: Listing routes in a Kapow! server.
     Listing routes allows users to know what URLs are
     available on a Kapow! server. The List endpoint returns
+95
@@ -0,0 +1,95 @@
#
# Copyright 2021 Banco Bilbao Vizcaya Argentaria, S.A.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
Feature: Communications with the control interface are secured with mTLS.
Trust is anchored via certificate pinning.
The Kapow! server only allows connections from trusted clients.
The Kapow! clients only establish connections to trusted servers.
@server
Scenario: Reject clients not providing a certificate.
Given I have a running Kapow! server
When I try to connect to the control API without providing a certificate
Then I get a connection error
@server
Scenario: Reject clients providing an invalid certificate.
Given I have a running Kapow! server
When I try to connect to the control API providing an invalid certificate
Then I get a connection error
@client
Scenario: Connect to servers providing a valid certificate.
A valid certificate is the one provided via envvars.
Given a test HTTPS server on the control port
When I run the following command
"""
$ kapow route list
"""
And the HTTPS server receives a "GET" request to "/routes"
And the server responds with
| field | value |
| status | 200 |
| headers.Content-Type | application/json |
| body | [] |
Then the command exits with "0"
@client
Scenario: Reject servers providing an invalid certificate.
Given a test HTTPS server on the control port
When I run the following command (with invalid certs)
"""
$ kapow route list
"""
Then the command exits immediately with "1"
@server
Scenario Outline: The control server is accessible through an alternative address
The automatically generated certificate contains the Subject Alternative Name
provided via the `--control-reachable-addr` parameter.
Given I launch the server with the following extra arguments
"""
--control-reachable-addr "<reachable_addr>"
"""
When I inspect the automatically generated control server certificate
Then the extension "Subject Alternative Name" contains "<value>" of type "<type>"
Examples:
| reachable_addr | value | type |
| localhost:8081 | localhost | DNSName |
| 127.0.0.1:8081 | 127.0.0.1 | IPAddress |
| foo.bar:8081 | foo.bar | DNSName |
| 4.2.2.4:8081 | 4.2.2.4 | IPAddress |
| [2600::]:8081 | 2600:: | IPAddress |
@e2e
Scenario: Control server dialog using mTLS
If the user provides the corresponding certificates to the
`kapow route` subcommand, the communication should be possible.
Given I have a just started Kapow! server
When I run the following command (setting the control certs environment variables)
"""
$ kapow route list
"""
Then the command exits with "0"
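The Examples table above fixes the expected SAN classification: IP literals (v4 or v6, the latter arriving bracketed) must appear as `IPAddress` entries, anything else as a `DNSName` entry. A minimal sketch of that rule, with `san_entry` as a hypothetical helper name (the suite's actual check lives in its step definitions):

```python
import ipaddress


def san_entry(reachable_addr):
    """Classify the host part of a host:port address the way the
    scenario outline above expects: IP literals become IPAddress
    entries, anything else a DNSName entry."""
    host, _, _ = reachable_addr.rpartition(":")
    host = host.strip("[]")  # IPv6 literals arrive bracketed, e.g. [2600::]
    try:
        return ("IPAddress", str(ipaddress.ip_address(host)))
    except ValueError:
        return ("DNSName", host)
```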
+24 -6
@@ -15,25 +15,43 @@
 #
 import tempfile
 import os
+import signal
+from contextlib import suppress
-def before_scenario(context, scenario):
-    # Create the request_handler FIFO
+def tmpfifo():
     while True:
-        context.handler_fifo_path = tempfile.mktemp()  # Safe because using
-                                                       # mkfifo
+        fifo_path = tempfile.mktemp()  # Safe because mkfifo is used below
         try:
-            os.mkfifo(context.handler_fifo_path)
+            os.mkfifo(fifo_path)
         except OSError:
             # The file already exists
             pass
         else:
             break
+    return fifo_path
+def before_scenario(context, scenario):
+    context.handler_fifo_path = tmpfifo()
+    context.init_script_fifo_path = tmpfifo()
 def after_scenario(context, scenario):
+    # Real Kapow! server being tested
     if hasattr(context, 'server'):
         context.server.terminate()
         context.server.wait()
     os.unlink(context.handler_fifo_path)
+    os.unlink(context.init_script_fifo_path)
+    # Mock HTTP server for testing
+    if hasattr(context, 'httpserver'):
+        context.response_ready.set()
+        context.httpserver.shutdown()
+        context.httpserver_thread.join()
+    if getattr(context, 'testing_handler_pid', None) is not None:
+        with suppress(ProcessLookupError):
+            os.kill(int(context.testing_handler_pid), signal.SIGTERM)
+9
@@ -0,0 +1,9 @@
#!/usr/bin/env python
import json
import os
import sys
if __name__ == '__main__':
with open(os.environ['SPECTEST_FIFO'], 'w') as fifo:
json.dump(dict(os.environ), fifo)
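The script above and the test-suite reader form a tiny handshake: the init program serializes its whole environment as JSON into the path named by `SPECTEST_FIFO`, and the suite loads it back to learn, among other things, the control certificates. A sketch of the round trip (using a regular temporary file in place of the FIFO so it runs anywhere):

```python
import json
import os
import tempfile

# Stand-in for the FIFO path the suite would pass via SPECTEST_FIFO.
fifo_path = tempfile.mktemp()

# Writer side: what get_environment.py does with the init program's environ.
child_environ = {"KAPOW_CONTROL_URL": "https://localhost:8081",
                 "KAPOW_CONTROL_SERVER_CERT": "PEM-DATA"}
with open(fifo_path, "w") as fifo:
    json.dump(child_environ, fifo)

# Reader side: what the suite does after the server boots.
with open(fifo_path) as fifo:
    init_script_environ = json.load(fifo)

os.unlink(fifo_path)
```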
+396 -56
@@ -13,26 +13,36 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
-from contextlib import suppress
+from contextlib import suppress, contextmanager
+from multiprocessing.pool import ThreadPool
 from time import sleep
+import datetime
+import http.server
+import ipaddress
 import json
+import logging
 import os
 import shlex
 import signal
 import socket
+import ssl
 import subprocess
 import sys
 import tempfile
 import threading
-from multiprocessing.pool import ThreadPool
 import time
-import requests
-from environconfig import EnvironConfig, StringVar, IntVar, BooleanVar
 from comparedict import is_subset
+from cryptography.hazmat.backends import default_backend
+from cryptography.hazmat.primitives.asymmetric import rsa
+from cryptography.hazmat.primitives import hashes
+from cryptography.hazmat.primitives import serialization
+from cryptography import x509
+from cryptography.x509.oid import NameOID, ExtensionOID
+from environconfig import EnvironConfig, StringVar, IntVar, BooleanVar
+from requests import exceptions as requests_exceptions
 import jsonexample
+import requests
-import logging
 WORD2POS = {"first": 0, "second": 1, "last": -1}
@@ -44,7 +54,8 @@ class Env(EnvironConfig):
     KAPOW_SERVER_CMD = StringVar(default="kapow server")
     #: Where the Control API is
-    KAPOW_CONTROL_URL = StringVar(default="http://localhost:8081")
+    KAPOW_CONTROL_URL = StringVar(default="https://localhost:8081")
+    KAPOW_CONTROL_PORT = IntVar(default=8081)
     #: Where the Data API is
     KAPOW_DATA_URL = StringVar(default="http://localhost:8082")
@@ -52,7 +63,9 @@ class Env(EnvironConfig):
     #: Where the User Interface is
     KAPOW_USER_URL = StringVar(default="http://localhost:8080")
-    KAPOW_BOOT_TIMEOUT = IntVar(default=1000)
+    KAPOW_CONTROL_TOKEN = StringVar(default="TEST-SPEC-CONTROL-TOKEN")
+    KAPOW_BOOT_TIMEOUT = IntVar(default=3000)
     KAPOW_DEBUG_TESTS = BooleanVar(default=False)
@@ -77,37 +90,134 @@ if Env.KAPOW_DEBUG_TESTS:
 requests_log.setLevel(logging.DEBUG)
 requests_log.propagate = True
+def generate_ssl_cert(subject_name, alternate_name):
+    # Generate our key
+    key = rsa.generate_private_key(
+        public_exponent=65537,
+        key_size=4096,
+    )
+    # Various details about who we are. For a self-signed certificate the
+    # subject and issuer are always the same.
+    subject = issuer = x509.Name([
+        x509.NameAttribute(NameOID.COMMON_NAME, subject_name),
+    ])
+    cert = x509.CertificateBuilder().subject_name(
+        subject
+    ).issuer_name(
+        issuer
+    ).public_key(
+        key.public_key()
+    ).serial_number(
+        x509.random_serial_number()
+    ).not_valid_before(
+        datetime.datetime.utcnow()
+    ).not_valid_after(
+        # Our certificate will be valid for 10 days
+        datetime.datetime.utcnow() + datetime.timedelta(days=10)
+    ).add_extension(
+        x509.SubjectAlternativeName([x509.DNSName(alternate_name)]),
+        critical=True,
+    ).add_extension(
+        x509.ExtendedKeyUsage(
+            [x509.oid.ExtendedKeyUsageOID.SERVER_AUTH
+             if subject_name.endswith('_server')
+             else x509.oid.ExtendedKeyUsageOID.CLIENT_AUTH]),
+        critical=True,
+    # Sign our certificate with our private key
+    ).sign(key, hashes.SHA256())
+    key_bytes = key.private_bytes(
+        encoding=serialization.Encoding.PEM,
+        format=serialization.PrivateFormat.TraditionalOpenSSL,
+        encryption_algorithm=serialization.NoEncryption()
+    )
+    crt_bytes = cert.public_bytes(serialization.Encoding.PEM)
+    return (key_bytes, crt_bytes)
+@contextmanager
+def mtls_client(context):
+    with tempfile.NamedTemporaryFile(suffix='.crt', encoding='utf-8', mode='w') as srv_cert, \
+         tempfile.NamedTemporaryFile(suffix='.crt', encoding='utf-8', mode='w') as cli_cert, \
+         tempfile.NamedTemporaryFile(suffix='.key', encoding='utf-8', mode='w') as cli_key:
+        srv_cert.write(context.init_script_environ["KAPOW_CONTROL_SERVER_CERT"])
+        srv_cert.file.flush()
+        cli_cert.write(context.init_script_environ["KAPOW_CONTROL_CLIENT_CERT"])
+        cli_cert.file.flush()
+        cli_key.write(context.init_script_environ["KAPOW_CONTROL_CLIENT_KEY"])
+        cli_key.file.flush()
+        session = requests.Session()
+        session.verify = srv_cert.name
+        session.cert = (cli_cert.name, cli_key.name)
+        yield session
+def is_port_open(port):
+    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
+        return sock.connect_ex(('127.0.0.1', port)) == 0
-def run_kapow_server(context):
+def run_kapow_server(context, extra_args=""):
+    assert not is_port_open(Env.KAPOW_CONTROL_PORT), "Another process is already bound"
     context.server = subprocess.Popen(
-        shlex.split(Env.KAPOW_SERVER_CMD),
+        shlex.split(Env.KAPOW_SERVER_CMD) + shlex.split(extra_args) + [os.path.join(HERE, "get_environment.py")],
         stdout=subprocess.DEVNULL,
         stderr=subprocess.DEVNULL,
+        env={'SPECTEST_FIFO': context.init_script_fifo_path, **os.environ},
         shell=False)
     # Check process is running with reachable APIs
     open_ports = False
     for _ in range(Env.KAPOW_BOOT_TIMEOUT):
-        is_running = context.server.poll() is None
-        assert is_running, "Server is not running!"
-        with suppress(requests.exceptions.ConnectionError):
-            open_ports = (
-                requests.head(Env.KAPOW_CONTROL_URL, timeout=1).status_code
-                and requests.head(Env.KAPOW_DATA_URL, timeout=1).status_code)
-            if open_ports:
+        with suppress(requests_exceptions.ConnectionError):
+            if is_port_open(Env.KAPOW_CONTROL_PORT):
+                open_ports = True
                 break
         sleep(.01)
     assert open_ports, "API is unreachable after KAPOW_BOOT_TIMEOUT"
+    # Get init_script environment via fifo
+    with open(context.init_script_fifo_path, 'r') as fifo:
+        context.init_script_environ = json.load(fifo)
 @given('I have a just started Kapow! server')
 @given('I have a running Kapow! server')
 def step_impl(context):
     run_kapow_server(context)
+@given(u'I launch the server with the following extra arguments')
+def step_impl(context):
+    run_kapow_server(context, context.text)
+@when('I request a route listing without providing a Control Access Token')
+def step_impl(context):
+    with mtls_client(context) as requests:
+        context.response = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes")
+@when('I request a route listing without providing an empty Control Access Token')
+def step_impl(context):
+    with mtls_client(context) as requests:
+        context.response = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes")
+@when(u'I request a route listing providing a bad Control Access Token')
+def step_impl(context):
+    with mtls_client(context) as requests:
+        context.response = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes")
 @when('I request a routes listing')
 def step_impl(context):
-    context.response = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes")
+    with mtls_client(context) as requests:
+        context.response = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes")
 @given('I have a Kapow! server with the following routes')
@@ -117,10 +227,12 @@ def step_impl(context):
     if not hasattr(context, 'table'):
         raise RuntimeError("A table must be set for this step.")
-    for row in context.table:
-        response = requests.post(f"{Env.KAPOW_CONTROL_URL}/routes",
-                                 json={h: row[h] for h in row.headings})
-        response.raise_for_status()
+    with mtls_client(context) as requests:
+        for row in context.table:
+            response = requests.post(
+                f"{Env.KAPOW_CONTROL_URL}/routes",
+                json={h: row[h] for h in row.headings})
+            response.raise_for_status()
 @given('I have a Kapow! server with the following testing routes')
@@ -130,15 +242,16 @@ def step_impl(context):
     if not hasattr(context, 'table'):
         raise RuntimeError("A table must be set for this step.")
-    for row in context.table:
-        response = requests.post(
-            f"{Env.KAPOW_CONTROL_URL}/routes",
-            json={"entrypoint": " ".join(
-                [sys.executable,
-                 shlex.quote(os.path.join(HERE, "testinghandler.py")),
-                 shlex.quote(context.handler_fifo_path)]),  # Created in before_scenario
-                **{h: row[h] for h in row.headings}})
-        response.raise_for_status()
+    with mtls_client(context) as requests:
+        for row in context.table:
+            response = requests.post(
+                f"{Env.KAPOW_CONTROL_URL}/routes",
+                json={"entrypoint": " ".join(
+                    [sys.executable,
+                     shlex.quote(os.path.join(HERE, "testinghandler.py")),
+                     shlex.quote(context.handler_fifo_path)]),  # Created in before_scenario
+                    **{h: row[h] for h in row.headings}})
+            response.raise_for_status()
 def testing_request(context, request_fn):
     # Run the request in background
@@ -165,15 +278,17 @@ def step_impl(context, path):
 @when('I release the testing request')
 def step_impl(context):
     os.kill(int(context.testing_handler_pid), signal.SIGTERM)
+    context.testing_handler_pid = None
     context.testing_response = context.testing_request.get()
 @when('I append the route')
 def step_impl(context):
-    context.response = requests.post(f"{Env.KAPOW_CONTROL_URL}/routes",
-                                     data=context.text,
-                                     headers={"Content-Type": "application/json"})
+    with mtls_client(context) as requests:
+        context.response = requests.post(
+            f"{Env.KAPOW_CONTROL_URL}/routes",
+            data=context.text,
+            headers={"Content-Type": "application/json"})
 @then('I get {code} as response code')
 def step_impl(context, code):
@@ -212,50 +327,62 @@ def step_impl(context):
 @when('I delete the route with id "{id}"')
 def step_impl(context, id):
-    context.response = requests.delete(f"{Env.KAPOW_CONTROL_URL}/routes/{id}")
+    with mtls_client(context) as requests:
+        context.response = requests.delete(
+            f"{Env.KAPOW_CONTROL_URL}/routes/{id}")
 @when('I insert the route')
 def step_impl(context):
-    context.response = requests.put(f"{Env.KAPOW_CONTROL_URL}/routes",
-                                    headers={"Content-Type": "application/json"},
-                                    data=context.text)
+    with mtls_client(context) as requests:
+        context.response = requests.put(
+            f"{Env.KAPOW_CONTROL_URL}/routes",
+            headers={"Content-Type": "application/json"},
+            data=context.text)
 @when('I try to append with this malformed JSON document')
 def step_impl(context):
-    context.response = requests.post(
-        f"{Env.KAPOW_CONTROL_URL}/routes",
-        headers={"Content-Type": "application/json"},
-        data=context.text)
+    with mtls_client(context) as requests:
+        context.response = requests.post(
+            f"{Env.KAPOW_CONTROL_URL}/routes",
+            headers={"Content-Type": "application/json"},
+            data=context.text)
 @when('I delete the {order} route')
 def step_impl(context, order):
-    idx = WORD2POS.get(order)
-    routes = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes")
-    id = routes.json()[idx]["id"]
-    context.response = requests.delete(f"{Env.KAPOW_CONTROL_URL}/routes/{id}")
+    with mtls_client(context) as requests:
+        idx = WORD2POS.get(order)
+        routes = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes")
+        id = routes.json()[idx]["id"]
+        context.response = requests.delete(
+            f"{Env.KAPOW_CONTROL_URL}/routes/{id}")
 @when('I try to insert with this JSON document')
 def step_impl(context):
-    context.response = requests.put(
-        f"{Env.KAPOW_CONTROL_URL}/routes",
-        headers={"Content-Type": "application/json"},
-        data=context.text)
+    with mtls_client(context) as requests:
+        context.response = requests.put(
+            f"{Env.KAPOW_CONTROL_URL}/routes",
+            headers={"Content-Type": "application/json"},
+            data=context.text)
 @when('I get the route with id "{id}"')
 def step_impl(context, id):
-    context.response = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes/{id}")
+    with mtls_client(context) as requests:
+        context.response = requests.get(
+            f"{Env.KAPOW_CONTROL_URL}/routes/{id}")
 @when('I get the {order} route')
 def step_impl(context, order):
-    idx = WORD2POS.get(order)
-    routes = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes")
-    id = routes.json()[idx]["id"]
-    context.response = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes/{id}")
+    with mtls_client(context) as requests:
+        idx = WORD2POS.get(order)
+        routes = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes")
+        id = routes.json()[idx]["id"]
+        context.response = requests.get(
+            f"{Env.KAPOW_CONTROL_URL}/routes/{id}")
 @when('I get the resource "{resource}"')
@@ -316,3 +443,216 @@ def step_impl(context, value, fieldType, elementName):
         raise ValueError("Unknown fieldtype {fieldType!r}")
     assert actual == value, f"Expecting {fieldType} {elementName!r} to be {value!r}, got {actual!r} insted"
@given('a test HTTPS server on the {port} port')
def step_impl(context, port):
context.request_ready = threading.Event()
context.request_ready.clear()
context.response_ready = threading.Event()
context.response_ready.clear()
class SaveResponseHandler(http.server.BaseHTTPRequestHandler):
def do_verb(self):
context.request_response = self
context.request_ready.set()
context.response_ready.wait()
do_GET=do_verb
do_POST=do_verb
do_PUT=do_verb
do_DELETE=do_verb
do_HEAD=do_verb
if port == "control":
port = 8081
elif port == "data":
port = 8082
else:
raise ValueError(f"Unknown port {port}")
context.httpserver = http.server.HTTPServer(('127.0.0.1', port),
SaveResponseHandler)
context.srv_key, context.srv_crt = generate_ssl_cert("control_server", "localhost")
context.cli_key, context.cli_crt = generate_ssl_cert("control_client", "localhost")
with tempfile.NamedTemporaryFile(suffix=".key") as key_file, \
tempfile.NamedTemporaryFile(suffix=".crt") as crt_file:
key_file.write(context.srv_key)
key_file.flush()
crt_file.write(context.srv_crt)
crt_file.flush()
context.httpserver.socket = ssl.wrap_socket(
context.httpserver.socket,
keyfile=key_file.name,
certfile=crt_file.name,
server_side=True)
context.httpserver_thread = threading.Thread(
target=context.httpserver.serve_forever,
daemon=True)
context.httpserver_thread.start()
def run_command_with_certs(context, srv_crt, cli_crt, cli_key):
_, command = context.text.split('$')
command = command.lstrip()
def exec_in_thread():
context.command = subprocess.Popen(
command,
shell=True,
env={'KAPOW_CONTROL_SERVER_CERT': srv_crt,
'KAPOW_CONTROL_CLIENT_CERT': cli_crt,
'KAPOW_CONTROL_CLIENT_KEY': cli_key,
**os.environ})
context.command.wait()
context.command_thread = threading.Thread(target=exec_in_thread, daemon=True)
context.command_thread.start()
@step('I run the following command (with invalid certs)')
def step_impl(context):
invalid_srv_crt, _ = generate_ssl_cert("invalid_control_server",
"localhost")
run_command_with_certs(context,
invalid_srv_crt,
context.cli_crt,
context.cli_key)
@step('I run the following command')
def step_impl(context):
run_command_with_certs(context,
context.srv_crt,
context.cli_crt,
context.cli_key)
@when('I run the following command (setting the control certs environment variables)')
def step_impl(context):
run_command_with_certs(
context,
context.init_script_environ["KAPOW_CONTROL_SERVER_CERT"],
context.init_script_environ["KAPOW_CONTROL_CLIENT_CERT"],
context.init_script_environ["KAPOW_CONTROL_CLIENT_KEY"])
@step('the HTTPS server receives a "{method}" request to "{path}"')
def step_impl(context, method, path):
context.request_ready.wait()
assert context.request_response.command == method, f"Method {context.request_response.command} is not {method}"
assert context.request_response.path == path, f"Path {context.request_response.path} is not {path}"
@then('the received request has the header "{name}" set to "{value}"')
def step_impl(context, name, value):
context.request_ready.wait()
matching = context.request_response.headers[name]
assert matching, f"Header {name} not found"
assert matching == value, f"Value of header doesn't match. {matching} != {value}"
@when('the server responds with')
def step_impl(context):
# TODO: set the fields given in the table
has_body = False
for row in context.table:
if row['field'] == 'status':
context.request_response.send_response(int(row['value']))
elif row['field'].startswith('headers.'):
_, header = row['field'].split('.')
context.request_response.send_header(header, row['value'])
elif row['field'] == 'body':
has_body = True
payload = row['value'].encode('utf-8')
context.request_response.send_header('Content-Length', str(len(payload)))
context.request_response.end_headers()
context.request_response.wfile.write(payload)
if not has_body:
context.request_response.send_header('Content-Length', '0')
context.request_response.end_headers()
context.response_ready.set()
@then('the command exits {immediately} with "{returncode}"')
@then('the command exits with "{returncode}"')
def step_impl(context, returncode, immediately=False):
context.command_thread.join(timeout=3.0 if immediately else None)
if context.command_thread.is_alive():
try:
print("killing in the name of")
context.command.kill()
finally:
assert False, "The command is still alive"
else:
context.command.wait()
    assert context.command.returncode == int(returncode), f"Command returned {context.command.returncode} instead of {returncode}"


@then('the received request doesn\'t have the header "{name}" set')
def step_impl(context, name):
context.request_ready.wait()
    assert name not in context.request_response.headers, f"Header {name} found"


@when('I try to connect to the control API without providing a certificate')
def step_impl(context):
try:
context.request_response = requests.get(f"{Env.KAPOW_CONTROL_URL}/routes", verify=False)
except Exception as exc:
        context.request_response = exc


@then(u'I get a connection error')
def step_impl(context):
    assert isinstance(context.request_response, Exception), context.request_response


@when(u'I try to connect to the control API providing an invalid certificate')
def step_impl(context):
key, cert = generate_ssl_cert("foo", "localhost")
with tempfile.NamedTemporaryFile(suffix='.crt') as cert_file, \
tempfile.NamedTemporaryFile(suffix='.key') as key_file:
cert_file.write(cert)
cert_file.flush()
key_file.write(key)
key_file.flush()
with requests.Session() as session:
session.cert = (cert_file.name, key_file.name)
session.verify = False
try:
context.request_response = session.get(
f"{Env.KAPOW_CONTROL_URL}/routes")
except Exception as exc:
                context.request_response = exc


@when('I inspect the automatically generated control server certificate')
def step_impl(context):
context.control_server_cert = x509.load_pem_x509_certificate(
        context.init_script_environ["KAPOW_CONTROL_SERVER_CERT"].encode('ascii'))


@then('the extension "{extension}" contains "{value}" of type "{typename}"')
def step_impl(context, extension, value, typename):
if extension == 'Subject Alternative Name':
oid = ExtensionOID.SUBJECT_ALTERNATIVE_NAME
else:
raise NotImplementedError(f'Unknown extension {extension}')
if typename == 'DNSName':
type_ = x509.DNSName
converter = lambda x: x
elif typename == 'IPAddress':
type_ = x509.IPAddress
converter = ipaddress.ip_address
else:
raise NotImplementedError(f'Unknown type {typename}')
ext = context.control_server_cert.extensions.get_extension_for_oid(oid)
values = ext.value.get_values_for_type(type_)
assert converter(value) in values, f"Value {value} not in {values}"
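The steps above pass the `KAPOW_CONTROL_SERVER_CERT`, `KAPOW_CONTROL_CLIENT_CERT` and `KAPOW_CONTROL_CLIENT_KEY` material around to exercise cross-pinned mTLS: the client trusts exactly one server certificate, and presents its own certificate for the server to verify in return. A minimal stdlib sketch of building such a client-side TLS context (the function name and file paths are hypothetical, not part of the test suite):

```python
import ssl


def cross_pinned_client_context(server_cert_pem, client_cert_file, client_key_file):
    """Build a TLS context that pins one server cert and presents a client cert."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # Trust only the single PEM certificate handed over out-of-band
    # (e.g. via KAPOW_CONTROL_SERVER_CERT), not the system CA bundle.
    ctx.load_verify_locations(cadata=server_cert_pem)
    # Present our own certificate and key so the server can pin us back.
    ctx.load_cert_chain(certfile=client_cert_file, keyfile=client_key_file)
    return ctx
```

With a context like this, `http.client.HTTPSConnection(host, port, context=ctx)` can reach a mutually-authenticated endpoint, which is roughly the arrangement the connection steps above put the client and server under test into.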
+17
View File
@@ -0,0 +1,17 @@
# This file has been generated by node2nix 1.8.0. Do not edit!
{pkgs ? import <nixpkgs> {
inherit system;
}, system ? builtins.currentSystem, nodejs ? pkgs."nodejs-12_x"}:
let
nodeEnv = import ./node-env.nix {
inherit (pkgs) stdenv python2 utillinux runCommand writeTextFile;
inherit nodejs;
libtool = if pkgs.stdenv.isDarwin then pkgs.darwin.cctools else null;
};
in
import ./node-packages.nix {
inherit (pkgs) fetchurl fetchgit;
inherit nodeEnv;
}
+542
View File
@@ -0,0 +1,542 @@
# This file originates from node2nix
{stdenv, nodejs, python2, utillinux, libtool, runCommand, writeTextFile}:
let
python = if nodejs ? python then nodejs.python else python2;
# Create a tar wrapper that filters all the 'Ignoring unknown extended header keyword' noise
tarWrapper = runCommand "tarWrapper" {} ''
mkdir -p $out/bin
cat > $out/bin/tar <<EOF
#! ${stdenv.shell} -e
$(type -p tar) "\$@" --warning=no-unknown-keyword --delay-directory-restore
EOF
chmod +x $out/bin/tar
'';
# Function that generates a TGZ file from a NPM project
buildNodeSourceDist =
{ name, version, src, ... }:
stdenv.mkDerivation {
name = "node-tarball-${name}-${version}";
inherit src;
buildInputs = [ nodejs ];
buildPhase = ''
export HOME=$TMPDIR
tgzFile=$(npm pack | tail -n 1) # Hooks to the pack command will add output (https://docs.npmjs.com/misc/scripts)
'';
installPhase = ''
mkdir -p $out/tarballs
mv $tgzFile $out/tarballs
mkdir -p $out/nix-support
echo "file source-dist $out/tarballs/$tgzFile" >> $out/nix-support/hydra-build-products
'';
};
includeDependencies = {dependencies}:
stdenv.lib.optionalString (dependencies != [])
(stdenv.lib.concatMapStrings (dependency:
''
# Bundle the dependencies of the package
mkdir -p node_modules
cd node_modules
# Only include dependencies if they don't exist. They may also be bundled in the package.
if [ ! -e "${dependency.name}" ]
then
${composePackage dependency}
fi
cd ..
''
) dependencies);
# Recursively composes the dependencies of a package
composePackage = { name, packageName, src, dependencies ? [], ... }@args:
builtins.addErrorContext "while evaluating node package '${packageName}'" ''
DIR=$(pwd)
cd $TMPDIR
unpackFile ${src}
# Make the base dir in which the target dependency resides first
mkdir -p "$(dirname "$DIR/${packageName}")"
if [ -f "${src}" ]
then
# Figure out what directory has been unpacked
packageDir="$(find . -maxdepth 1 -type d | tail -1)"
# Restore write permissions to make building work
find "$packageDir" -type d -exec chmod u+x {} \;
chmod -R u+w "$packageDir"
# Move the extracted tarball into the output folder
mv "$packageDir" "$DIR/${packageName}"
elif [ -d "${src}" ]
then
# Get a stripped name (without hash) of the source directory.
# On old nixpkgs it's already set internally.
if [ -z "$strippedName" ]
then
strippedName="$(stripHash ${src})"
fi
# Restore write permissions to make building work
chmod -R u+w "$strippedName"
# Move the extracted directory into the output folder
mv "$strippedName" "$DIR/${packageName}"
fi
# Unset the stripped name to not confuse the next unpack step
unset strippedName
# Include the dependencies of the package
cd "$DIR/${packageName}"
${includeDependencies { inherit dependencies; }}
cd ..
${stdenv.lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
'';
pinpointDependencies = {dependencies, production}:
let
pinpointDependenciesFromPackageJSON = writeTextFile {
name = "pinpointDependencies.js";
text = ''
var fs = require('fs');
var path = require('path');
function resolveDependencyVersion(location, name) {
if(location == process.env['NIX_STORE']) {
return null;
} else {
var dependencyPackageJSON = path.join(location, "node_modules", name, "package.json");
if(fs.existsSync(dependencyPackageJSON)) {
var dependencyPackageObj = JSON.parse(fs.readFileSync(dependencyPackageJSON));
if(dependencyPackageObj.name == name) {
return dependencyPackageObj.version;
}
} else {
return resolveDependencyVersion(path.resolve(location, ".."), name);
}
}
}
function replaceDependencies(dependencies) {
if(typeof dependencies == "object" && dependencies !== null) {
for(var dependency in dependencies) {
var resolvedVersion = resolveDependencyVersion(process.cwd(), dependency);
if(resolvedVersion === null) {
process.stderr.write("WARNING: cannot pinpoint dependency: "+dependency+", context: "+process.cwd()+"\n");
} else {
dependencies[dependency] = resolvedVersion;
}
}
}
}
/* Read the package.json configuration */
var packageObj = JSON.parse(fs.readFileSync('./package.json'));
/* Pinpoint all dependencies */
replaceDependencies(packageObj.dependencies);
if(process.argv[2] == "development") {
replaceDependencies(packageObj.devDependencies);
}
replaceDependencies(packageObj.optionalDependencies);
/* Write the fixed package.json file */
fs.writeFileSync("package.json", JSON.stringify(packageObj, null, 2));
'';
};
in
''
node ${pinpointDependenciesFromPackageJSON} ${if production then "production" else "development"}
${stdenv.lib.optionalString (dependencies != [])
''
if [ -d node_modules ]
then
cd node_modules
${stdenv.lib.concatMapStrings (dependency: pinpointDependenciesOfPackage dependency) dependencies}
cd ..
fi
''}
'';
# Recursively traverses all dependencies of a package and pinpoints all
# dependencies in the package.json file to the versions that are actually
# being used.
pinpointDependenciesOfPackage = { packageName, dependencies ? [], production ? true, ... }@args:
''
if [ -d "${packageName}" ]
then
cd "${packageName}"
${pinpointDependencies { inherit dependencies production; }}
cd ..
${stdenv.lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
fi
'';
# Extract the Node.js source code which is used to compile packages with
# native bindings
nodeSources = runCommand "node-sources" {} ''
tar --no-same-owner --no-same-permissions -xf ${nodejs.src}
mv node-* $out
'';
# Script that adds _integrity fields to all package.json files to prevent NPM from consulting the cache (that is empty)
addIntegrityFieldsScript = writeTextFile {
name = "addintegrityfields.js";
text = ''
var fs = require('fs');
var path = require('path');
function augmentDependencies(baseDir, dependencies) {
for(var dependencyName in dependencies) {
var dependency = dependencies[dependencyName];
// Open package.json and augment metadata fields
var packageJSONDir = path.join(baseDir, "node_modules", dependencyName);
var packageJSONPath = path.join(packageJSONDir, "package.json");
if(fs.existsSync(packageJSONPath)) { // Only augment packages that exist. Sometimes we may have production installs in which development dependencies can be ignored
console.log("Adding metadata fields to: "+packageJSONPath);
var packageObj = JSON.parse(fs.readFileSync(packageJSONPath));
if(dependency.integrity) {
packageObj["_integrity"] = dependency.integrity;
} else {
packageObj["_integrity"] = "sha1-000000000000000000000000000="; // When no _integrity string has been provided (e.g. by Git dependencies), add a dummy one. It does not seem to harm and it bypasses downloads.
}
if(dependency.resolved) {
packageObj["_resolved"] = dependency.resolved; // Adopt the resolved property if one has been provided
} else {
packageObj["_resolved"] = dependency.version; // Set the resolved version to the version identifier. This prevents NPM from cloning Git repositories.
}
if(dependency.from !== undefined) { // Adopt from property if one has been provided
packageObj["_from"] = dependency.from;
}
fs.writeFileSync(packageJSONPath, JSON.stringify(packageObj, null, 2));
}
// Augment transitive dependencies
if(dependency.dependencies !== undefined) {
augmentDependencies(packageJSONDir, dependency.dependencies);
}
}
}
if(fs.existsSync("./package-lock.json")) {
var packageLock = JSON.parse(fs.readFileSync("./package-lock.json"));
if(packageLock.lockfileVersion !== 1) {
process.stderr.write("Sorry, I only understand lock file version 1!\n");
process.exit(1);
}
if(packageLock.dependencies !== undefined) {
augmentDependencies(".", packageLock.dependencies);
}
}
'';
};
# Reconstructs a package-lock file from the node_modules/ folder structure and package.json files with dummy sha1 hashes
reconstructPackageLock = writeTextFile {
name = "addintegrityfields.js";
text = ''
var fs = require('fs');
var path = require('path');
var packageObj = JSON.parse(fs.readFileSync("package.json"));
var lockObj = {
name: packageObj.name,
version: packageObj.version,
lockfileVersion: 1,
requires: true,
dependencies: {}
};
function augmentPackageJSON(filePath, dependencies) {
var packageJSON = path.join(filePath, "package.json");
if(fs.existsSync(packageJSON)) {
var packageObj = JSON.parse(fs.readFileSync(packageJSON));
dependencies[packageObj.name] = {
version: packageObj.version,
integrity: "sha1-000000000000000000000000000=",
dependencies: {}
};
processDependencies(path.join(filePath, "node_modules"), dependencies[packageObj.name].dependencies);
}
}
function processDependencies(dir, dependencies) {
if(fs.existsSync(dir)) {
var files = fs.readdirSync(dir);
files.forEach(function(entry) {
var filePath = path.join(dir, entry);
var stats = fs.statSync(filePath);
if(stats.isDirectory()) {
if(entry.substr(0, 1) == "@") {
// When we encounter a namespace folder, augment all packages belonging to the scope
var pkgFiles = fs.readdirSync(filePath);
pkgFiles.forEach(function(entry) {
if(stats.isDirectory()) {
var pkgFilePath = path.join(filePath, entry);
augmentPackageJSON(pkgFilePath, dependencies);
}
});
} else {
augmentPackageJSON(filePath, dependencies);
}
}
});
}
}
processDependencies("node_modules", lockObj.dependencies);
fs.writeFileSync("package-lock.json", JSON.stringify(lockObj, null, 2));
'';
};
prepareAndInvokeNPM = {packageName, bypassCache, reconstructLock, npmFlags, production}:
let
forceOfflineFlag = if bypassCache then "--offline" else "--registry http://www.example.com";
in
''
# Pinpoint the versions of all dependencies to the ones that are actually being used
echo "pinpointing versions of dependencies..."
source $pinpointDependenciesScriptPath
# Patch the shebangs of the bundled modules to prevent them from
# calling executables outside the Nix store as much as possible
patchShebangs .
# Deploy the Node.js package by running npm install. Since the
# dependencies have been provided already by ourselves, it should not
# attempt to install them again, which is good, because we want to make
# it Nix's responsibility. If it needs to install any dependencies
# anyway (e.g. because the dependency parameters are
# incomplete/incorrect), it fails.
#
# The other responsibilities of NPM are kept -- version checks, build
# steps, postprocessing etc.
export HOME=$TMPDIR
cd "${packageName}"
runHook preRebuild
${stdenv.lib.optionalString bypassCache ''
${stdenv.lib.optionalString reconstructLock ''
if [ -f package-lock.json ]
then
echo "WARNING: Reconstruct lock option enabled, but a lock file already exists!"
echo "This will most likely result in version mismatches! We will remove the lock file and regenerate it!"
rm package-lock.json
else
echo "No package-lock.json file found, reconstructing..."
fi
node ${reconstructPackageLock}
''}
node ${addIntegrityFieldsScript}
''}
npm ${forceOfflineFlag} --nodedir=${nodeSources} ${npmFlags} ${stdenv.lib.optionalString production "--production"} rebuild
if [ "''${dontNpmInstall-}" != "1" ]
then
# NPM tries to download packages even when they already exist if npm-shrinkwrap is used.
rm -f npm-shrinkwrap.json
npm ${forceOfflineFlag} --nodedir=${nodeSources} ${npmFlags} ${stdenv.lib.optionalString production "--production"} install
fi
'';
# Builds and composes an NPM package including all its dependencies
buildNodePackage =
{ name
, packageName
, version
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, preRebuild ? ""
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, ... }@args:
let
extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" "dontStrip" "dontNpmInstall" "preRebuild" "unpackPhase" "buildPhase" ];
in
stdenv.mkDerivation ({
name = "node_${name}-${version}";
buildInputs = [ tarWrapper python nodejs ]
++ stdenv.lib.optional (stdenv.isLinux) utillinux
++ stdenv.lib.optional (stdenv.isDarwin) libtool
++ buildInputs;
inherit nodejs;
inherit dontStrip; # Stripping may fail a build for some package deployments
inherit dontNpmInstall preRebuild unpackPhase buildPhase;
compositionScript = composePackage args;
pinpointDependenciesScript = pinpointDependenciesOfPackage args;
passAsFile = [ "compositionScript" "pinpointDependenciesScript" ];
installPhase = ''
# Create and enter a root node_modules/ folder
mkdir -p $out/lib/node_modules
cd $out/lib/node_modules
# Compose the package and all its dependencies
source $compositionScriptPath
${prepareAndInvokeNPM { inherit packageName bypassCache reconstructLock npmFlags production; }}
# Create symlink to the deployed executable folder, if applicable
if [ -d "$out/lib/node_modules/.bin" ]
then
ln -s $out/lib/node_modules/.bin $out/bin
fi
# Create symlinks to the deployed manual page folders, if applicable
if [ -d "$out/lib/node_modules/${packageName}/man" ]
then
mkdir -p $out/share
for dir in "$out/lib/node_modules/${packageName}/man/"*
do
mkdir -p $out/share/man/$(basename "$dir")
for page in "$dir"/*
do
ln -s $page $out/share/man/$(basename "$dir")
done
done
fi
# Run post install hook, if provided
runHook postInstall
'';
} // extraArgs);
# Builds a development shell
buildNodeShell =
{ name
, packageName
, version
, src
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, ... }@args:
let
extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" ];
nodeDependencies = stdenv.mkDerivation ({
name = "node-dependencies-${name}-${version}";
buildInputs = [ tarWrapper python nodejs ]
++ stdenv.lib.optional (stdenv.isLinux) utillinux
++ stdenv.lib.optional (stdenv.isDarwin) libtool
++ buildInputs;
inherit dontStrip; # Stripping may fail a build for some package deployments
inherit dontNpmInstall unpackPhase buildPhase;
includeScript = includeDependencies { inherit dependencies; };
pinpointDependenciesScript = pinpointDependenciesOfPackage args;
passAsFile = [ "includeScript" "pinpointDependenciesScript" ];
installPhase = ''
mkdir -p $out/${packageName}
cd $out/${packageName}
source $includeScriptPath
# Create fake package.json to make the npm commands work properly
cp ${src}/package.json .
chmod 644 package.json
${stdenv.lib.optionalString bypassCache ''
if [ -f ${src}/package-lock.json ]
then
cp ${src}/package-lock.json .
fi
''}
# Go to the parent folder to make sure that all packages are pinpointed
cd ..
${stdenv.lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
${prepareAndInvokeNPM { inherit packageName bypassCache reconstructLock npmFlags production; }}
# Expose the executables that were installed
cd ..
${stdenv.lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
mv ${packageName} lib
ln -s $out/lib/node_modules/.bin $out/bin
'';
} // extraArgs);
in
stdenv.mkDerivation {
name = "node-shell-${name}-${version}";
buildInputs = [ python nodejs ] ++ stdenv.lib.optional (stdenv.isLinux) utillinux ++ buildInputs;
buildCommand = ''
mkdir -p $out/bin
cat > $out/bin/shell <<EOF
#! ${stdenv.shell} -e
$shellHook
exec ${stdenv.shell}
EOF
chmod +x $out/bin/shell
'';
# Provide the dependencies in a development shell through the NODE_PATH environment variable
inherit nodeDependencies;
shellHook = stdenv.lib.optionalString (dependencies != []) ''
export NODE_PATH=${nodeDependencies}/lib/node_modules
export PATH="${nodeDependencies}/bin:$PATH"
'';
};
in
{
buildNodeSourceDist = stdenv.lib.makeOverridable buildNodeSourceDist;
buildNodePackage = stdenv.lib.makeOverridable buildNodePackage;
buildNodeShell = stdenv.lib.makeOverridable buildNodeShell;
}
+3
View File
@@ -0,0 +1,3 @@
[
"gherkin-lint"
]
+403
View File
@@ -0,0 +1,403 @@
# This file has been generated by node2nix 1.8.0. Do not edit!
{nodeEnv, fetchurl, fetchgit, globalBuildInputs ? []}:
let
sources = {
"@protobufjs/aspromise-1.1.2" = {
name = "_at_protobufjs_slash_aspromise";
packageName = "@protobufjs/aspromise";
version = "1.1.2";
src = fetchurl {
url = "https://registry.npmjs.org/@protobufjs/aspromise/-/aspromise-1.1.2.tgz";
sha1 = "9b8b0cc663d669a7d8f6f5d0893a14d348f30fbf";
};
};
"@protobufjs/base64-1.1.2" = {
name = "_at_protobufjs_slash_base64";
packageName = "@protobufjs/base64";
version = "1.1.2";
src = fetchurl {
url = "https://registry.npmjs.org/@protobufjs/base64/-/base64-1.1.2.tgz";
sha512 = "AZkcAA5vnN/v4PDqKyMR5lx7hZttPDgClv83E//FMNhR2TMcLUhfRUBHCmSl0oi9zMgDDqRUJkSxO3wm85+XLg==";
};
};
"@protobufjs/codegen-2.0.4" = {
name = "_at_protobufjs_slash_codegen";
packageName = "@protobufjs/codegen";
version = "2.0.4";
src = fetchurl {
url = "https://registry.npmjs.org/@protobufjs/codegen/-/codegen-2.0.4.tgz";
sha512 = "YyFaikqM5sH0ziFZCN3xDC7zeGaB/d0IUb9CATugHWbd1FRFwWwt4ld4OYMPWu5a3Xe01mGAULCdqhMlPl29Jg==";
};
};
"@protobufjs/eventemitter-1.1.0" = {
name = "_at_protobufjs_slash_eventemitter";
packageName = "@protobufjs/eventemitter";
version = "1.1.0";
src = fetchurl {
url = "https://registry.npmjs.org/@protobufjs/eventemitter/-/eventemitter-1.1.0.tgz";
sha1 = "355cbc98bafad5978f9ed095f397621f1d066b70";
};
};
"@protobufjs/fetch-1.1.0" = {
name = "_at_protobufjs_slash_fetch";
packageName = "@protobufjs/fetch";
version = "1.1.0";
src = fetchurl {
url = "https://registry.npmjs.org/@protobufjs/fetch/-/fetch-1.1.0.tgz";
sha1 = "ba99fb598614af65700c1619ff06d454b0d84c45";
};
};
"@protobufjs/float-1.0.2" = {
name = "_at_protobufjs_slash_float";
packageName = "@protobufjs/float";
version = "1.0.2";
src = fetchurl {
url = "https://registry.npmjs.org/@protobufjs/float/-/float-1.0.2.tgz";
sha1 = "5e9e1abdcb73fc0a7cb8b291df78c8cbd97b87d1";
};
};
"@protobufjs/inquire-1.1.0" = {
name = "_at_protobufjs_slash_inquire";
packageName = "@protobufjs/inquire";
version = "1.1.0";
src = fetchurl {
url = "https://registry.npmjs.org/@protobufjs/inquire/-/inquire-1.1.0.tgz";
sha1 = "ff200e3e7cf2429e2dcafc1140828e8cc638f089";
};
};
"@protobufjs/path-1.1.2" = {
name = "_at_protobufjs_slash_path";
packageName = "@protobufjs/path";
version = "1.1.2";
src = fetchurl {
url = "https://registry.npmjs.org/@protobufjs/path/-/path-1.1.2.tgz";
sha1 = "6cc2b20c5c9ad6ad0dccfd21ca7673d8d7fbf68d";
};
};
"@protobufjs/pool-1.1.0" = {
name = "_at_protobufjs_slash_pool";
packageName = "@protobufjs/pool";
version = "1.1.0";
src = fetchurl {
url = "https://registry.npmjs.org/@protobufjs/pool/-/pool-1.1.0.tgz";
sha1 = "09fd15f2d6d3abfa9b65bc366506d6ad7846ff54";
};
};
"@protobufjs/utf8-1.1.0" = {
name = "_at_protobufjs_slash_utf8";
packageName = "@protobufjs/utf8";
version = "1.1.0";
src = fetchurl {
url = "https://registry.npmjs.org/@protobufjs/utf8/-/utf8-1.1.0.tgz";
sha1 = "a777360b5b39a1a2e5106f8e858f2fd2d060c570";
};
};
"@types/long-4.0.1" = {
name = "_at_types_slash_long";
packageName = "@types/long";
version = "4.0.1";
src = fetchurl {
url = "https://registry.npmjs.org/@types/long/-/long-4.0.1.tgz";
sha512 = "5tXH6Bx/kNGd3MgffdmP4dy2Z+G4eaXw0SE81Tq3BNadtnMR5/ySMzX4SLEzHJzSmPNn4HIdpQsBvXMUykr58w==";
};
};
"@types/node-13.13.40" = {
name = "_at_types_slash_node";
packageName = "@types/node";
version = "13.13.40";
src = fetchurl {
url = "https://registry.npmjs.org/@types/node/-/node-13.13.40.tgz";
sha512 = "eKaRo87lu1yAXrzEJl0zcJxfUMDT5/mZalFyOkT44rnQps41eS2pfWzbaulSPpQLFNy29bFqn+Y5lOTL8ATlEQ==";
};
};
"@types/uuid-3.4.9" = {
name = "_at_types_slash_uuid";
packageName = "@types/uuid";
version = "3.4.9";
src = fetchurl {
url = "https://registry.npmjs.org/@types/uuid/-/uuid-3.4.9.tgz";
sha512 = "XDwyIlt/47l2kWLTzw/mtrpLdB+GPSskR2n/PIcPn+VYhVO77rGhRncIR5GPU0KRzXuqkDO+J5qqrG0Y8P6jzQ==";
};
};
"balanced-match-1.0.0" = {
name = "balanced-match";
packageName = "balanced-match";
version = "1.0.0";
src = fetchurl {
url = "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.0.tgz";
sha1 = "89b4d199ab2bee49de164ea02b89ce462d71b767";
};
};
"brace-expansion-1.1.11" = {
name = "brace-expansion";
packageName = "brace-expansion";
version = "1.1.11";
src = fetchurl {
url = "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.11.tgz";
sha512 = "iCuPHDFgrHX7H2vEI/5xpz07zSHB00TpugqhmYtVmMO6518mCuRMoOYFldEBl0g187ufozdaHgWKcYFb61qGiA==";
};
};
"buffer-from-1.1.1" = {
name = "buffer-from";
packageName = "buffer-from";
version = "1.1.1";
src = fetchurl {
url = "https://registry.npmjs.org/buffer-from/-/buffer-from-1.1.1.tgz";
sha512 = "MQcXEUbCKtEo7bhqEs6560Hyd4XaovZlO/k9V3hjVUF/zwW7KBVdSK4gIt/bzwS9MbR5qob+F5jusZsb0YQK2A==";
};
};
"commander-4.1.1" = {
name = "commander";
packageName = "commander";
version = "4.1.1";
src = fetchurl {
url = "https://registry.npmjs.org/commander/-/commander-4.1.1.tgz";
sha512 = "NOKm8xhkzAjzFx8B2v5OAHT+u5pRQc2UCa2Vq9jYL/31o2wi9mxBA7LIFs3sV5VSC49z6pEhfbMULvShKj26WA==";
};
};
"commander-5.0.0" = {
name = "commander";
packageName = "commander";
version = "5.0.0";
src = fetchurl {
url = "https://registry.npmjs.org/commander/-/commander-5.0.0.tgz";
sha512 = "JrDGPAKjMGSP1G0DUoaceEJ3DZgAfr/q6X7FVk4+U5KxUSKviYGM2k6zWkfyyBHy5rAtzgYJFa1ro2O9PtoxwQ==";
};
};
"concat-map-0.0.1" = {
name = "concat-map";
packageName = "concat-map";
version = "0.0.1";
src = fetchurl {
url = "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz";
sha1 = "d8a96bd77fd68df7793a73036a3ba0d5405d477b";
};
};
"core-js-3.6.4" = {
name = "core-js";
packageName = "core-js";
version = "3.6.4";
src = fetchurl {
url = "https://registry.npmjs.org/core-js/-/core-js-3.6.4.tgz";
sha512 = "4paDGScNgZP2IXXilaffL9X7968RuvwlkK3xWtZRVqgd8SYNiVKRJvkFd1aqqEuPfN7E68ZHEp9hDj6lHj4Hyw==";
};
};
"cucumber-messages-8.0.0" = {
name = "cucumber-messages";
packageName = "cucumber-messages";
version = "8.0.0";
src = fetchurl {
url = "https://registry.npmjs.org/cucumber-messages/-/cucumber-messages-8.0.0.tgz";
sha512 = "lUnWRMjwA9+KhDec/5xRZV3Du67ISumHnVLywWQXyvzmc4P+Eqx8CoeQrBQoau3Pw1hs4kJLTDyV85hFBF00SQ==";
};
};
"fs.realpath-1.0.0" = {
name = "fs.realpath";
packageName = "fs.realpath";
version = "1.0.0";
src = fetchurl {
url = "https://registry.npmjs.org/fs.realpath/-/fs.realpath-1.0.0.tgz";
sha1 = "1504ad2523158caa40db4a2787cb01411994ea4f";
};
};
"gherkin-9.0.0" = {
name = "gherkin";
packageName = "gherkin";
version = "9.0.0";
src = fetchurl {
url = "https://registry.npmjs.org/gherkin/-/gherkin-9.0.0.tgz";
sha512 = "6xoAepoxo5vhkBXjB4RCfVnSKHu5z9SqXIQVUyj+Jw8BQX8odATlee5otXgdN8llZvyvHokuvNiBeB3naEnnIQ==";
};
};
"glob-7.1.6" = {
name = "glob";
packageName = "glob";
version = "7.1.6";
src = fetchurl {
url = "https://registry.npmjs.org/glob/-/glob-7.1.6.tgz";
sha512 = "LwaxwyZ72Lk7vZINtNNrywX0ZuLyStrdDtabefZKAY5ZGJhVtgdznluResxNmPitE0SAO+O26sWTHeKSI2wMBA==";
};
};
"inflight-1.0.6" = {
name = "inflight";
packageName = "inflight";
version = "1.0.6";
src = fetchurl {
url = "https://registry.npmjs.org/inflight/-/inflight-1.0.6.tgz";
sha1 = "49bd6331d7d02d0c09bc910a1075ba8165b56df9";
};
};
"inherits-2.0.4" = {
name = "inherits";
packageName = "inherits";
version = "2.0.4";
src = fetchurl {
url = "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz";
sha512 = "k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==";
};
};
"lodash-4.17.20" = {
name = "lodash";
packageName = "lodash";
version = "4.17.20";
src = fetchurl {
url = "https://registry.npmjs.org/lodash/-/lodash-4.17.20.tgz";
sha512 = "PlhdFcillOINfeV7Ni6oF1TAEayyZBoZ8bcshTHqOYJYlrqzRK5hagpagky5o4HfCzzd1TRkXPMFq6cKk9rGmA==";
};
};
"long-4.0.0" = {
name = "long";
packageName = "long";
version = "4.0.0";
src = fetchurl {
url = "https://registry.npmjs.org/long/-/long-4.0.0.tgz";
sha512 = "XsP+KhQif4bjX1kbuSiySJFNAehNxgLb6hPRGJ9QsUr8ajHkuXGdrHmFUTUUXhDwVX2R5bY4JNZEwbUiMhV+MA==";
};
};
"minimatch-3.0.4" = {
name = "minimatch";
packageName = "minimatch";
version = "3.0.4";
src = fetchurl {
url = "https://registry.npmjs.org/minimatch/-/minimatch-3.0.4.tgz";
sha512 = "yJHVQEhyqPLUTgt9B83PXu6W3rx4MvvHvSUvToogpwoGDOUQ+yDrR0HRot+yOCdCO7u4hX3pWft6kWBBcqh0UA==";
};
};
"once-1.4.0" = {
name = "once";
packageName = "once";
version = "1.4.0";
src = fetchurl {
url = "https://registry.npmjs.org/once/-/once-1.4.0.tgz";
sha1 = "583b1aa775961d4b113ac17d9c50baef9dd76bd1";
};
};
"path-is-absolute-1.0.1" = {
name = "path-is-absolute";
packageName = "path-is-absolute";
version = "1.0.1";
src = fetchurl {
url = "https://registry.npmjs.org/path-is-absolute/-/path-is-absolute-1.0.1.tgz";
sha1 = "174b9268735534ffbc7ace6bf53a5a9e1b5c5f5f";
};
};
"protobufjs-6.10.2" = {
name = "protobufjs";
packageName = "protobufjs";
version = "6.10.2";
src = fetchurl {
url = "https://registry.npmjs.org/protobufjs/-/protobufjs-6.10.2.tgz";
sha512 = "27yj+04uF6ya9l+qfpH187aqEzfCF4+Uit0I9ZBQVqK09hk/SQzKa2MUqUpXaVa7LOFRg1TSSr3lVxGOk6c0SQ==";
};
};
"source-map-0.6.1" = {
name = "source-map";
packageName = "source-map";
version = "0.6.1";
src = fetchurl {
url = "https://registry.npmjs.org/source-map/-/source-map-0.6.1.tgz";
sha512 = "UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==";
};
};
"source-map-support-0.5.19" = {
name = "source-map-support";
packageName = "source-map-support";
version = "0.5.19";
src = fetchurl {
url = "https://registry.npmjs.org/source-map-support/-/source-map-support-0.5.19.tgz";
sha512 = "Wonm7zOCIJzBGQdB+thsPar0kYuCIzYvxZwlBa87yi/Mdjv7Tip2cyVbLj5o0cFPN4EVkuTwb3GDDyUx2DGnGw==";
};
};
"strip-json-comments-3.0.1" = {
name = "strip-json-comments";
packageName = "strip-json-comments";
version = "3.0.1";
src = fetchurl {
url = "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.0.1.tgz";
sha512 = "VTyMAUfdm047mwKl+u79WIdrZxtFtn+nBxHeb844XBQ9uMNTuTHdx2hc5RiAJYqwTj3wc/xe5HLSdJSkJ+WfZw==";
};
};
"uuid-3.4.0" = {
name = "uuid";
packageName = "uuid";
version = "3.4.0";
src = fetchurl {
url = "https://registry.npmjs.org/uuid/-/uuid-3.4.0.tgz";
sha512 = "HjSDRw6gZE5JMggctHBcjVak08+KEVhSIiDzFnT9S9aegmp85S/bReBVTb4QTFaRNptJ9kuYaNhnbNEOkbKb/A==";
};
};
"wrappy-1.0.2" = {
name = "wrappy";
packageName = "wrappy";
version = "1.0.2";
src = fetchurl {
url = "https://registry.npmjs.org/wrappy/-/wrappy-1.0.2.tgz";
sha1 = "b5243d8f3ec1aa35f1364605bc0d1036e30ab69f";
};
};
};
in
{
gherkin-lint = nodeEnv.buildNodePackage {
name = "gherkin-lint";
packageName = "gherkin-lint";
version = "4.1.3";
src = fetchurl {
url = "https://registry.npmjs.org/gherkin-lint/-/gherkin-lint-4.1.3.tgz";
sha512 = "5oagKEUqPgwKkJGtlqshy8mWNpWBRIFDeex63BOPF3+yC2GOMjdyvAHTQfHhkDqgwEdOpda2F8yGe1EBj5/dgw==";
};
dependencies = [
sources."@protobufjs/aspromise-1.1.2"
sources."@protobufjs/base64-1.1.2"
sources."@protobufjs/codegen-2.0.4"
sources."@protobufjs/eventemitter-1.1.0"
sources."@protobufjs/fetch-1.1.0"
sources."@protobufjs/float-1.0.2"
sources."@protobufjs/inquire-1.1.0"
sources."@protobufjs/path-1.1.2"
sources."@protobufjs/pool-1.1.0"
sources."@protobufjs/utf8-1.1.0"
sources."@types/long-4.0.1"
sources."@types/node-13.13.40"
sources."@types/uuid-3.4.9"
sources."balanced-match-1.0.0"
sources."brace-expansion-1.1.11"
sources."buffer-from-1.1.1"
sources."commander-5.0.0"
sources."concat-map-0.0.1"
sources."core-js-3.6.4"
sources."cucumber-messages-8.0.0"
sources."fs.realpath-1.0.0"
(sources."gherkin-9.0.0" // {
dependencies = [
sources."commander-4.1.1"
];
})
sources."glob-7.1.6"
sources."inflight-1.0.6"
sources."inherits-2.0.4"
sources."lodash-4.17.20"
sources."long-4.0.0"
sources."minimatch-3.0.4"
sources."once-1.4.0"
sources."path-is-absolute-1.0.1"
sources."protobufjs-6.10.2"
sources."source-map-0.6.1"
sources."source-map-support-0.5.19"
sources."strip-json-comments-3.0.1"
sources."uuid-3.4.0"
sources."wrappy-1.0.2"
];
buildInputs = globalBuildInputs;
meta = {
description = "A Gherkin linter/validator written in javascript";
homepage = "https://github.com/vsiakka/gherkin-lint#readme";
license = "ISC";
};
production = true;
bypassCache = true;
reconstructLock = true;
};
}
+37
View File
@@ -0,0 +1,37 @@
{ pkgs ? import (builtins.fetchTarball {
name = "nixos-20.09-2021-01-15";
url = "https://github.com/nixos/nixpkgs/archive/cd63096d6d887d689543a0b97743d28995bc9bc3.tar.gz";
sha256 = "1wg61h4gndm3vcprdcg7rc4s1v3jkm5xd7lw8r2f67w502y94gcy";
}) {} }:
let
environconfig = pkgs.python38Packages.buildPythonPackage rec {
pname = "environconfig";
version = "1.7.0";
src = pkgs.python38Packages.fetchPypi {
inherit pname version;
sha256 = "087amqnqsx7d816adszd1424kma1kx9lfnzffr140wvy7a50vi86";
};
meta = {
homepage = "https://github.com/buguroo/environconfig";
description = "Environment variables made easy";
};
};
pythonDependencies = [
pkgs.python38Packages.behave
pkgs.python38Packages.requests
environconfig
];
nodeDependencies = (pkgs.callPackage ./node-dependencies.nix {});
in
pkgs.mkShell {
buildInputs = [
pkgs.python38
pythonDependencies
pkgs.gnumake
pkgs.which
nodeDependencies.gherkin-lint
];
}