forked from CGM_Public/pretix_original

Compare commits: `global_app...release/20` (7 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 53cd098a25 | |
| | bdd474a0f1 | |
| | 0b6b0829ed | |
| | bedeaef7da | |
| | 669f622ce6 | |
| | d166c7ee68 | |
| | 0282cfbdd5 | |
`.github/workflows/tests.yml` (vendored, 6 changed lines)

```diff
@@ -35,7 +35,7 @@ jobs:
       - uses: actions/checkout@v2
       - uses: harmon758/postgresql-action@v1
         with:
-          postgresql version: '15'
+          postgresql version: '11'
           postgresql db: 'pretix'
           postgresql user: 'postgres'
           postgresql password: 'postgres'
@@ -66,10 +66,6 @@ jobs:
       - name: Run tests
         working-directory: ./src
         run: PRETIX_CONFIG_FILE=tests/travis_${{ matrix.database }}.cfg py.test -n 3 -p no:sugar --cov=./ --cov-report=xml --reruns 3 tests --maxfail=100
-      - name: Run concurrency tests
-        working-directory: ./src
-        run: PRETIX_CONFIG_FILE=tests/travis_${{ matrix.database }}.cfg py.test tests/concurrency_tests/ --reruns 0 --reuse-db
-        if: matrix.database == 'postgres'
      - name: Upload coverage
        uses: codecov/codecov-action@v1
        with:
```
`Dockerfile` (24 changed lines)

```diff
@@ -1,4 +1,4 @@
-FROM python:3.11-bookworm
+FROM python:3.11-bullseye

 RUN apt-get update && \
     apt-get install -y --no-install-recommends \
@@ -20,20 +20,20 @@ RUN apt-get update && \
             supervisor \
             libmaxminddb0 \
             libmaxminddb-dev \
-            zlib1g-dev \
-            nodejs \
-            npm && \
+            zlib1g-dev && \
     apt-get clean && \
     rm -rf /var/lib/apt/lists/* && \
     dpkg-reconfigure locales && \
     locale-gen C.UTF-8 && \
     /usr/sbin/update-locale LANG=C.UTF-8 && \
     mkdir /etc/pretix && \
     mkdir /data && \
     useradd -ms /bin/bash -d /pretix -u 15371 pretixuser && \
     echo 'pretixuser ALL=(ALL) NOPASSWD:SETENV: /usr/bin/supervisord' >> /etc/sudoers && \
     mkdir /static && \
-    mkdir /etc/supervisord
+    mkdir /etc/supervisord && \
+    curl -fsSL https://deb.nodesource.com/setup_16.x | sudo -E bash - && \
+    apt-get install -y nodejs


 ENV LC_ALL=C.UTF-8 \
@@ -63,10 +63,10 @@ RUN pip3 install -U \
 RUN chmod +x /usr/local/bin/pretix && \
     rm /etc/nginx/sites-enabled/default && \
     cd /pretix/src && \
     rm -f pretix.cfg && \
     mkdir -p data && \
     chown -R pretixuser:pretixuser /pretix /data data && \
     sudo -u pretixuser make production

 USER pretixuser
 VOLUME ["/etc/pretix", "/data"]
```
```diff
@@ -1,4 +1,4 @@
 from pretix.settings import *

 LOGGING['handlers']['mail_admins']['include_html'] = True
-STORAGES["staticfiles"]["BACKEND"] = 'django.contrib.staticfiles.storage.ManifestStaticFilesStorage'
+STATICFILES_STORAGE = 'django.contrib.staticfiles.storage.ManifestStaticFilesStorage'
```
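The hunk above swaps Django's newer `STORAGES` dict for the legacy `STATICFILES_STORAGE` string. The two forms are equivalent; Django 4.2 introduced the `STORAGES` setting and deprecated the old one. A minimal sketch of the correspondence (plain dicts and strings, no Django needed to see the shape):

```python
# The static-files backend both settings point at:
BACKEND = 'django.contrib.staticfiles.storage.ManifestStaticFilesStorage'

# Django >= 4.2 style (one side of the diff above):
STORAGES = {
    "default": {"BACKEND": "django.core.files.storage.FileSystemStorage"},
    "staticfiles": {"BACKEND": BACKEND},
}

# Pre-4.2 style (the other side of the diff):
STATICFILES_STORAGE = BACKEND
```

Which form a settings file must use depends only on the installed Django version, which is why this line flips between the two branches.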
```diff
@@ -152,11 +152,6 @@ Example::
     password=abcd
     host=localhost
     port=3306
-    advisory_lock_index=1
-    sslmode=require
-    sslrootcert=/etc/pretix/postgresql-ca.crt
-    sslcert=/etc/pretix/postgresql-client-crt.crt
-    sslkey=/etc/pretix/postgresql-client-key.key

 ``backend``
     One of ``sqlite3`` and ``postgresql``.
@@ -168,17 +163,6 @@ Example::
 ``user``, ``password``, ``host``, ``port``
     Connection details for the database connection. Empty by default.

-``advisory_lock_index``
-    On PostgreSQL, pretix uses the "advisory lock" feature. However, advisory locks use a server-wide namespace
-    and are not scoped to a specific database. If you run multiple pretix applications with the same PostgreSQL
-    server, you should set separate values for this setting (integers up to 256).
-
-``sslmode``, ``sslrootcert``
-    Connection TLS details for the PostgreSQL database connection. Possible values of ``sslmode`` are ``disable``, ``allow``, ``prefer``, ``require``, ``verify-ca``, and ``verify-full``. ``sslrootcert`` should be the accessible path of the CA certificate. Both values are empty by default.
-
-``sslcert``, ``sslkey``
-    Connection mTLS details for the PostgreSQL database connection. It is also necessary to specify the ``sslmode`` and ``sslrootcert`` parameters; see the TLS description above for the correct values. ``sslcert`` should be the accessible path of the client certificate, ``sslkey`` the accessible path of the client key. All values are empty by default.
-
 .. _`config-replica`:

 Database replica settings
@@ -340,10 +324,6 @@ to speed up various operations::
         ["sentinel_host_3", 26379]
     ]
     password=password
-    ssl_cert_reqs=required
-    ssl_ca_certs=/etc/pretix/redis-ca.pem
-    ssl_certfile=/etc/pretix/redis-client-crt.pem
-    ssl_keyfile=/etc/pretix/redis-client-key.key

 ``location``
     The location of redis, as a URL of the form ``redis://[:password]@localhost:6379/0``
@@ -367,22 +347,6 @@ to speed up various operations::
     If your redis setup doesn't require a password or you already specified it in the location you can omit this option.
     If this is set it will be passed to redis as the connection option ``PASSWORD``.

-``ssl_cert_reqs``
-    If this is set it will be passed to redis as the connection option ``SSL_CERT_REQS``.
-    Possible values are ``none``, ``optional``, and ``required``.
-
-``ssl_ca_certs``
-    If your redis setup doesn't require TLS you can omit this option.
-    If this is set it will be passed to redis as the connection option ``SSL_CA_CERTS``. Possible value is the CA path.
-
-``ssl_keyfile``
-    If your redis setup doesn't require mTLS you can omit this option.
-    If this is set it will be passed to redis as the connection option ``SSL_KEYFILE``. Possible value is the keyfile path.
-
-``ssl_certfile``
-    If your redis setup doesn't require mTLS you can omit this option.
-    If this is set it will be passed to redis as the connection option ``SSL_CERTFILE``. Possible value is the certfile path.
-
 If redis is not configured, pretix will store sessions and locks in the database. If memcached
 is configured, memcached will be used for caching instead of redis.
@@ -432,8 +396,6 @@ The two ``transport_options`` entries can be omitted in most cases.
 If they are present they need to be a valid JSON dictionary.
 For possible entries in that dictionary see the `Celery documentation`_.

-It is possible to use Redis with TLS/mTLS for the broker or the backend. To do so, it is necessary to specify the TLS scheme ``rediss``, the SSL mode ``ssl_cert_reqs``, and optionally the CA (TLS) ``ssl_ca_certs``, certificate ``ssl_certfile``, and key ``ssl_keyfile`` (mTLS) paths as URL-encoded strings. The following URI describes the format and possible parameters: ``rediss://0.0.0.0:6379/1?ssl_cert_reqs=required&ssl_ca_certs=%2Fetc%2Fpretix%2Fredis-ca.pem&ssl_certfile=%2Fetc%2Fpretix%2Fredis-client-crt.pem&ssl_keyfile=%2Fetc%2Fpretix%2Fredis-client-key.key``
-
 To use redis with sentinels set the broker or backend to ``sentinel://sentinel_host_1:26379;sentinel_host_2:26379/0``
 and the respective transport_options to ``{"master_name":"mymaster"}``.
 If your redis instances behind the sentinel have a password use ``sentinel://:my_password@sentinel_host_1:26379;sentinel_host_2:26379/0``.
```
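The PostgreSQL options removed in the hunks above map one-to-one onto libpq connection parameters. A small sketch, assuming a plain dict of the `[database]` options; the `build_pg_dsn` helper is illustrative and not part of pretix:

```python
# Illustrative only: turn pretix.cfg [database] options into a libpq
# key/value DSN string. Option names follow the documented settings;
# missing/empty options are simply omitted, matching "empty by default".
def build_pg_dsn(opts: dict) -> str:
    keys = ["host", "port", "user", "password",
            "sslmode", "sslrootcert", "sslcert", "sslkey"]
    parts = [f"{k}={opts[k]}" for k in keys if opts.get(k)]
    return " ".join(parts)

dsn = build_pg_dsn({
    "host": "localhost",
    "user": "pretix",
    "password": "abcd",
    "sslmode": "verify-full",
    "sslrootcert": "/etc/pretix/postgresql-ca.crt",
    "sslcert": "/etc/pretix/postgresql-client-crt.crt",
    "sslkey": "/etc/pretix/postgresql-client-key.key",
})
```

With `sslmode=verify-full`, libpq checks both the server certificate chain (against `sslrootcert`) and the host name, while `sslcert`/`sslkey` enable client-certificate (mTLS) authentication.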
```diff
@@ -68,7 +68,7 @@ generated key and installs the plugin from the URL we told you::
     mkdir -p /etc/ssh && \
     ssh-keyscan -t rsa -p 10022 code.rami.io >> /root/.ssh/known_hosts && \
     echo StrictHostKeyChecking=no >> /root/.ssh/config && \
-    DJANGO_SETTINGS_MODULE= pip3 install -U "git+ssh://git@code.rami.io:10022/pretix/pretix-slack.git@stable#egg=pretix-slack" && \
+    DJANGO_SETTINGS_MODULE=pretix.settings pip3 install -U "git+ssh://git@code.rami.io:10022/pretix/pretix-slack.git@stable#egg=pretix-slack" && \
     cd /pretix/src && \
     sudo -u pretixuser make production
 USER pretixuser
```
```diff
@@ -12,7 +12,7 @@ solution with many things readily set-up, look at :ref:`dockersmallscale`.
 get it right. If you're not feeling comfortable managing a Linux server, check out our hosting and service
 offers at `pretix.eu`_.

-We tested this guide on the Linux distribution **Debian 12** but it should work very similarly on other
+We tested this guide on the Linux distribution **Debian 11.6** but it should work very similarly on other
 modern distributions, especially on all systemd-based ones.

 Requirements
@@ -64,7 +64,7 @@ Package dependencies

 To build and run pretix, you will need the following debian packages::

-    # apt-get install git build-essential python3-dev python3-venv python3 python3-pip \
+    # apt-get install git build-essential python-dev python3-venv python3 python3-pip \
                       python3-dev libxml2-dev libxslt1-dev libffi-dev zlib1g-dev libssl-dev \
                       gettext libpq-dev libjpeg-dev libopenjp2-7-dev
@@ -130,10 +130,9 @@ We now install pretix, its direct dependencies and gunicorn::

 Note that you need Python 3.9 or newer. You can find out your Python version using ``python -V``.

-We also need to create a data directory and allow your webserver to traverse to the root directory::
+We also need to create a data directory::

     (venv)$ mkdir -p /var/pretix/data/media
-    (venv)$ chmod +x /var/pretix

 Finally, we compile static files and translation data and create the database structure::
@@ -249,14 +248,14 @@ The following snippet is an example on how to configure a nginx proxy for pretix
     }

     location /static/ {
-        alias /var/pretix/venv/lib/python3.11/site-packages/pretix/static.dist/;
+        alias /var/pretix/venv/lib/python3.10/site-packages/pretix/static.dist/;
         access_log off;
         expires 365d;
         add_header Cache-Control "public";
     }
 }

-.. note:: Remember to replace the ``python3.11`` in the ``/static/`` path in the config
+.. note:: Remember to replace the ``python3.10`` in the ``/static/`` path in the config
           above with your python version.

 We recommend reading about setting `strong encryption settings`_ for your web server.
```
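The nginx `alias` in the last hunk hardcodes the interpreter's minor version, which is exactly why it keeps changing between branches. A small sketch of deriving the path from the running interpreter instead, so the alias can be generated; the helper is illustrative and not part of pretix or its docs:

```python
import sys
import sysconfig

# site-packages of the interpreter that runs pretix (inside the venv this is
# /var/pretix/venv/lib/python3.X/site-packages on a typical Linux install).
site_packages = sysconfig.get_path("purelib")

# The directory the nginx /static/ location should alias to:
static_alias = f"{site_packages}/pretix/static.dist/"

# The version tag the docs tell you to replace by hand:
version_tag = f"python3.{sys.version_info.minor}"
```

Printing `static_alias` from the venv's Python gives the exact string to paste into the nginx config, with the right `python3.X` segment for that installation.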
```diff
@@ -16,17 +16,12 @@ already upgraded to pretix 5.0 or later, downgrade back to the last 4.x release
 Update database schema
 ----------------------

-Before you start, make sure your database schema is up to date. With a local installation::
+Before you start, make sure your database schema is up to date::

     # sudo -u pretix -s
     $ source /var/pretix/venv/bin/activate
     (venv)$ python -m pretix migrate

-With a docker installation::
-
-    docker exec -it pretix.service pretix migrate
-

 Install PostgreSQL
 ------------------
@@ -75,14 +70,10 @@ Of course, instead of all this you can also run a PostgreSQL docker container an
 Stop pretix
 -----------

-To prevent any more changes to your data, stop pretix from running. With a local installation::
+To prevent any more changes to your data, stop pretix from running::

     # systemctl stop pretix-web pretix-worker

-With docker::
-
-    # systemctl stop pretix
-
 Change configuration
 --------------------
@@ -99,16 +90,12 @@ Change the database configuration in your ``/etc/pretix/pretix.cfg`` file::
 Create database schema
 ----------------------

-To create the schema in your new PostgreSQL database, use the following commands. With a local installation::
+To create the schema in your new PostgreSQL database, use the following commands::

     # sudo -u pretix -s
     $ source /var/pretix/venv/bin/activate
     (venv)$ python -m pretix migrate

-With docker::
-
-    # docker run --rm -v /var/pretix-data:/data -v /etc/pretix:/etc/pretix -v /var/run/redis:/var/run/redis pretix/standalone:stable migrate
-
 Migrate your data
 -----------------
@@ -157,18 +144,11 @@ Afterwards, delete the file again::
 Start pretix
 ------------

-Stop your MySQL server as a verification step that you are no longer using it::
+Now, restart pretix. You may also stop your MySQL server as a verification step that you are no longer using it::

     # systemctl stop mariadb

-Then, restart pretix. With a local installation::
-
-    # systemctl start pretix-web pretix-worker
-
-With a docker installation::
-
-    # systemctl start pretix
+    # systemctl start pretix-web pretix-worker

 And you're done! After you've verified everything has been copied correctly, you can delete the old MySQL database.

 .. note:: Don't forget to update your backup process to back up your PostgreSQL database instead of your MySQL database now.
```
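The guide's final step is verifying that "everything has been copied correctly" before deleting the old MySQL database. One cheap check is comparing per-table row counts on both ends. A self-contained sketch, using in-memory SQLite as a stand-in for both servers; with real databases you would open the connections with your MySQL and PostgreSQL drivers, and the table name here is only illustrative:

```python
import sqlite3

# Count rows per table on a DB-API connection. The same helper works for
# MySQL and PostgreSQL connections since it only uses execute()/fetchone().
def row_counts(conn, tables):
    return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
            for t in tables}

# SQLite stand-ins for the old (MySQL) and new (PostgreSQL) databases:
old_db = sqlite3.connect(":memory:")
new_db = sqlite3.connect(":memory:")
for conn in (old_db, new_db):
    conn.execute("CREATE TABLE pretixbase_order (id INTEGER PRIMARY KEY)")
    conn.executemany("INSERT INTO pretixbase_order (id) VALUES (?)",
                     [(i,) for i in range(3)])

tables = ["pretixbase_order"]
assert row_counts(old_db, tables) == row_counts(new_db, tables)
```

Matching counts do not prove the contents are identical, but a mismatch catches an interrupted or partial copy before the old database is gone.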
```diff
@@ -32,16 +32,10 @@ as well as the type of underlying hardware. Example:
       "token": "kpp4jn8g2ynzonp6",
       "hardware_brand": "Samsung",
       "hardware_model": "Galaxy S",
       "os_name": "Android",
       "os_version": "2.3.6",
       "software_brand": "pretixdroid",
-      "software_version": "4.0.0",
-      "rsa_pubkey": "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqh…nswIDAQAB\n-----END PUBLIC KEY-----\n"
+      "software_version": "4.0.0"
     }

-The ``rsa_pubkey`` is optional and only required for certain features such as working with reusable
-media and NFC cryptography.
-
 Every initialization token can only be used once. On success, you will receive a response containing
 information on your device as well as your API token:
@@ -104,8 +98,6 @@ following endpoint:
      {
        "hardware_brand": "Samsung",
        "hardware_model": "Galaxy S",
        "os_name": "Android",
        "os_version": "2.3.6",
        "software_brand": "pretixdroid",
        "software_version": "4.1.0",
        "info": {"arbitrary": "data"}
@@ -141,29 +133,9 @@ The response will look like this:
           "id": 3,
           "name": "South entrance"
         }
       },
       "server": {
         "version": {
           "pretix": "3.6.0.dev0",
           "pretix_numeric": 30060001000
         }
-      },
-      "medium_key_sets": [
-        {
-          "public_id": 3456349,
-          "organizer": "foo",
-          "active": true,
-          "media_type": "nfc_mf0aes",
-          "uid_key": "base64-encoded-encrypted-key",
-          "diversification_key": "base64-encoded-encrypted-key"
-        }
-      ]
+      }
     }
   }

-``medium_key_sets`` will always be empty if you did not set an ``rsa_pubkey``.
-The individual keys in the key sets are encrypted with the device's ``rsa_pubkey``
-using ``RSA/ECB/PKCS1Padding``.
-
 Creating a new API key
 ----------------------
```
```diff
@@ -24,8 +24,6 @@ all_events boolean Whether this de
 limit_events                          list                       List of event slugs this device has access to
 hardware_brand                        string                     Device hardware manufacturer (read-only)
 hardware_model                        string                     Device hardware model (read-only)
 os_name                               string                     Device operating system name (read-only)
 os_version                            string                     Device operating system version (read-only)
 software_brand                        string                     Device software product (read-only)
 software_version                      string                     Device software version (read-only)
 created                               datetime                   Creation time
@@ -78,8 +76,6 @@ Device endpoints
        "security_profile": "full",
        "hardware_brand": "Zebra",
        "hardware_model": "TC25",
        "os_name": "Android",
        "os_version": "8.1.0",
        "software_brand": "pretixSCAN",
        "software_version": "1.5.1"
      }
@@ -127,8 +123,6 @@ Device endpoints
        "security_profile": "full",
        "hardware_brand": "Zebra",
        "hardware_model": "TC25",
        "os_name": "Android",
        "os_version": "8.1.0",
        "software_brand": "pretixSCAN",
        "software_version": "1.5.1"
      }
@@ -179,8 +173,6 @@ Device endpoints
        "initialized": null,
        "hardware_brand": null,
        "hardware_model": null,
        "os_name": null,
        "os_version": null,
        "software_brand": null,
        "software_version": null
      }
```
```diff
@@ -31,9 +31,9 @@ subevent_mode strings Determines h
                                                             ``"same"`` (discount is only applied for groups within
                                                             the same date), or ``"distinct"`` (discount is only applied
                                                             for groups with no two same dates).
-condition_all_products                   boolean            If ``true``, the discount condition applies to all items.
+condition_all_products                   boolean            If ``true``, the discount applies to all items.
 condition_limit_products                 list of integers   If ``condition_all_products`` is not set, this is a list
-                                                            of internal item IDs that the discount condition applies to.
+                                                            of internal item IDs that the discount applies to.
 condition_apply_to_addons                boolean            If ``true``, the discount applies to add-on products as well,
                                                             otherwise it only applies to top-level items. The discount never
                                                             applies to bundled products.
@@ -48,17 +48,6 @@ benefit_discount_matching_percent decimal (string) The percenta
 benefit_only_apply_to_cheapest_n_matches integer            If set higher than 0, the discount will only be applied to
                                                             the cheapest matches. Useful for a "3 for 2"-style discount.
                                                             Cannot be combined with ``condition_min_value``.
-benefit_same_products                    boolean            If ``true``, the discount benefit applies to the same set of items
-                                                            as the condition (see above).
-benefit_limit_products                   list of integers   If ``benefit_same_products`` is not set, this is a list
-                                                            of internal item IDs that the discount benefit applies to.
-benefit_apply_to_addons                  boolean            (Only used if ``benefit_same_products`` is ``false``.)
-                                                            If ``true``, the discount applies to add-on products as well,
-                                                            otherwise it only applies to top-level items. The discount never
-                                                            applies to bundled products.
-benefit_ignore_voucher_discounted        boolean            (Only used if ``benefit_same_products`` is ``false``.)
-                                                            If ``true``, the discount does not apply to products which have
-                                                            been discounted by a voucher.
 ======================================== ========================== =======================================================
@@ -105,10 +94,6 @@ Endpoints
        "condition_ignore_voucher_discounted": false,
        "condition_min_count": 3,
        "condition_min_value": "0.00",
-       "benefit_same_products": true,
-       "benefit_limit_products": [],
-       "benefit_apply_to_addons": true,
-       "benefit_ignore_voucher_discounted": false,
        "benefit_discount_matching_percent": "100.00",
        "benefit_only_apply_to_cheapest_n_matches": 1
      }
@@ -161,10 +146,6 @@ Endpoints
        "condition_ignore_voucher_discounted": false,
        "condition_min_count": 3,
        "condition_min_value": "0.00",
-       "benefit_same_products": true,
-       "benefit_limit_products": [],
-       "benefit_apply_to_addons": true,
-       "benefit_ignore_voucher_discounted": false,
        "benefit_discount_matching_percent": "100.00",
        "benefit_only_apply_to_cheapest_n_matches": 1
      }
@@ -203,10 +184,6 @@ Endpoints
        "condition_ignore_voucher_discounted": false,
        "condition_min_count": 3,
        "condition_min_value": "0.00",
-       "benefit_same_products": true,
-       "benefit_limit_products": [],
-       "benefit_apply_to_addons": true,
-       "benefit_ignore_voucher_discounted": false,
        "benefit_discount_matching_percent": "100.00",
        "benefit_only_apply_to_cheapest_n_matches": 1
      }
@@ -234,10 +211,6 @@ Endpoints
        "condition_ignore_voucher_discounted": false,
        "condition_min_count": 3,
        "condition_min_value": "0.00",
-       "benefit_same_products": true,
-       "benefit_limit_products": [],
-       "benefit_apply_to_addons": true,
-       "benefit_ignore_voucher_discounted": false,
        "benefit_discount_matching_percent": "100.00",
        "benefit_only_apply_to_cheapest_n_matches": 1
      }
@@ -294,10 +267,6 @@ Endpoints
        "condition_ignore_voucher_discounted": false,
        "condition_min_count": 3,
        "condition_min_value": "0.00",
-       "benefit_same_products": true,
-       "benefit_limit_products": [],
-       "benefit_apply_to_addons": true,
-       "benefit_ignore_voucher_discounted": false,
        "benefit_discount_matching_percent": "100.00",
        "benefit_only_apply_to_cheapest_n_matches": 1
      }
```
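The discount fields above combine into deals like "3 for 2": with `condition_min_count` of 3, a matching percent of 100, and `benefit_only_apply_to_cheapest_n_matches` of 1, every complete group of three tickets gets its cheapest member for free. A sketch of that arithmetic, following the documented field semantics; the function itself is illustrative, not pretix's implementation:

```python
# "3 for 2": for every complete group of `min_count` matching positions,
# discount the cheapest `cheapest_n` of them by `percent` percent.
def three_for_two_discount(prices, min_count=3, percent=100, cheapest_n=1):
    ordered = sorted(prices, reverse=True)   # most expensive first
    groups = len(ordered) // min_count       # only complete groups qualify
    discount = 0.0
    for g in range(groups):
        group = ordered[g * min_count:(g + 1) * min_count]
        # apply the matching percent to the cheapest n members of the group
        for price in sorted(group)[:cheapest_n]:
            discount += price * percent / 100
    return discount
```

Sorting most-expensive-first before grouping means a cart of two groups gets the fourth- and sixth-cheapest-style outcomes customers expect: each free ticket is the cheapest within its group, and leftover tickets that do not fill a group earn no discount.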
```diff
@@ -12,7 +12,6 @@ The invoice resource contains the following public fields:
 Field                                 Type                       Description
 ===================================== ========================== =======================================================
 number                                string                     Invoice number (with prefix)
-event                                 string                     The slug of the parent event
 order                                 string                     Order code of the order this invoice belongs to
 is_cancellation                       boolean                    ``true``, if this invoice is the cancellation of a
                                                                  different invoice.
@@ -122,13 +121,9 @@ internal_reference string Customer's refe

     The attribute ``lines.subevent`` has been added.

-.. versionchanged:: 2023.8
-
-    The ``event`` attribute has been added. The organizer-level endpoint has been added.
-

-List of all invoices
---------------------
+Endpoints
+---------

 .. http:get:: /api/v1/organizers/(organizer)/events/(event)/invoices/
@@ -157,7 +152,6 @@ List of all invoices
     "results": [
       {
        "number": "SAMPLECONF-00001",
-       "event": "sampleconf",
        "order": "ABC12",
        "is_cancellation": false,
        "invoice_from_name": "Big Events LLC",
@@ -227,50 +221,6 @@ List of all invoices
    :statuscode 401: Authentication failure
    :statuscode 403: The requested organizer/event does not exist **or** you have no permission to view this resource.

-.. http:get:: /api/v1/organizers/(organizer)/invoices/
-
-   Returns a list of all invoices within all events of a given organizer (with sufficient access permissions).
-
-   Supported query parameters and output format of this endpoint are identical to the list endpoint within an event.
-
-   **Example request**:
-
-   .. sourcecode:: http
-
-      GET /api/v1/organizers/bigevents/invoices/ HTTP/1.1
-      Host: pretix.eu
-      Accept: application/json, text/javascript
-
-   **Example response**:
-
-   .. sourcecode:: http
-
-      HTTP/1.1 200 OK
-      Vary: Accept
-      Content-Type: application/json
-
-      {
-        "count": 1,
-        "next": null,
-        "previous": null,
-        "results": [
-          {
-            "number": "SAMPLECONF-00001",
-            "event": "sampleconf",
-            "order": "ABC12",
-            ...
-          }
-        ]
-      }
-
-   :param organizer: The ``slug`` field of the organizer to fetch
-   :statuscode 200: no error
-   :statuscode 401: Authentication failure
-   :statuscode 403: The requested organizer/event does not exist **or** you have no permission to view this resource.
-
-
 Fetching individual invoices
 ----------------------------
@@ -293,7 +243,6 @@ Fetching individual invoices

      {
        "number": "SAMPLECONF-00001",
-       "event": "sampleconf",
        "order": "ABC12",
        "is_cancellation": false,
        "invoice_from_name": "Big Events LLC",
@@ -388,12 +337,6 @@ Fetching individual invoices
    :statuscode 409: The file is not yet ready and will now be prepared. Retry the request after waiting for a few
                     seconds.


 Modifying invoices
 ------------------

 Invoices cannot be edited directly, but the following actions can be triggered:

 .. http:post:: /api/v1/organizers/(organizer)/events/(event)/invoices/(invoice_no)/reissue/

    Cancels the invoice and creates a new one.
```
```diff
@@ -20,7 +20,6 @@ The order resource contains the following public fields:
 Field                                 Type                       Description
 ===================================== ========================== =======================================================
 code                                  string                     Order code
-event                                 string                     The slug of the parent event
 status                                string                     Order status, one of:

                                                                  * ``n`` – pending
@@ -131,10 +130,6 @@ last_modified datetime Last modificati

     The ``valid_if_pending`` attribute has been added.

-.. versionchanged:: 2023.8
-
-    The ``event`` attribute has been added. The organizer-level endpoint has been added.
-

 .. _order-position-resource:
@@ -294,7 +289,6 @@ List of all orders
     "results": [
       {
        "code": "ABC12",
-       "event": "sampleconf",
        "status": "p",
        "testmode": false,
        "secret": "k24fiuwvu8kxz3y1",
@@ -447,48 +441,6 @@ List of all orders
    :statuscode 401: Authentication failure
    :statuscode 403: The requested organizer/event does not exist **or** you have no permission to view this resource.

-.. http:get:: /api/v1/organizers/(organizer)/orders/
-
-   Returns a list of all orders within all events of a given organizer (with sufficient access permissions).
-
-   Supported query parameters and output format of this endpoint are identical to the list endpoint within an event,
-   with the exception that the ``pdf_data`` parameter is not supported here.
-
-   **Example request**:
-
-   .. sourcecode:: http
-
-      GET /api/v1/organizers/bigevents/orders/ HTTP/1.1
-      Host: pretix.eu
-      Accept: application/json, text/javascript
-
-   **Example response**:
-
-   .. sourcecode:: http
-
-      HTTP/1.1 200 OK
-      Vary: Accept
-      Content-Type: application/json
-      X-Page-Generated: 2017-12-01T10:00:00Z
-
-      {
-        "count": 1,
-        "next": null,
-        "previous": null,
-        "results": [
-          {
-            "code": "ABC12",
-            "event": "sampleconf",
-            ...
-          }
-        ]
-      }
-
-   :param organizer: The ``slug`` field of the organizer to fetch
-   :statuscode 200: no error
-   :statuscode 401: Authentication failure
-   :statuscode 403: The requested organizer/event does not exist **or** you have no permission to view this resource.
-
 Fetching individual orders
 --------------------------
@@ -514,7 +466,6 @@ Fetching individual orders

      {
        "code": "ABC12",
-       "event": "sampleconf",
        "status": "p",
        "testmode": false,
        "secret": "k24fiuwvu8kxz3y1",
```
```diff
@@ -18,7 +18,7 @@ The reusable medium resource contains the following public fields:
 Field                                 Type                       Description
 ===================================== ========================== =======================================================
 id                                    integer                    Internal ID of the medium
-type                                  string                     Type of medium, e.g. ``"barcode"``, ``"nfc_uid"`` or ``"nfc_mf0aes"``.
+type                                  string                     Type of medium, e.g. ``"barcode"`` or ``"nfc_uid"``.
 organizer                             string                     Organizer slug of the organizer who "owns" this medium.
 identifier                            string                     Unique identifier of the medium. The format depends on the ``type``.
 active                                boolean                    Whether this medium may be used.
@@ -37,7 +37,6 @@ Existing media types are:

 - ``barcode``
 - ``nfc_uid``
-- ``nfc_mf0aes``

 Endpoints
 ---------
```
```diff
@@ -1,10 +1,10 @@
-Scheduled email rules
+Automated email rules
 =====================

 Resource description
 --------------------

-Scheduled email rules specify emails that the system will send automatically at a specific point in time, e.g.
+Automated email rules specify emails that the system will send automatically at a specific point in time, e.g.
 the day of the event.

 .. rst-class:: rest-resource-table
@@ -18,19 +18,8 @@ subject multi-lingual string The subject of
 template                              multi-lingual string       The body of the email
 all_products                          boolean                    If ``true``, the email is sent to buyers of all products
 limit_products                        list of integers           List of product IDs, if ``all_products`` is not set
-[**DEPRECATED**] include_pending      boolean                    If ``true``, the email is sent to pending orders. If ``false``,
+include_pending                       boolean                    If ``true``, the email is sent to pending orders. If ``false``,
                                                                  only paid orders are considered.
-restrict_to_status                    list                       List of order states to restrict recipients to. Valid
-                                                                 entries are ``p`` for paid, ``e`` for expired, ``c`` for canceled,
-                                                                 ``n__pending_approval`` for pending approval,
-                                                                 ``n__not_pending_approval_and_not_valid_if_pending`` for payment
-                                                                 pending, ``n__valid_if_pending`` for payment pending but already confirmed,
-                                                                 and ``n__pending_overdue`` for pending with payment overdue.
-                                                                 The default is ``["p", "n__valid_if_pending"]``.
-checked_in_status                     string                     Check-in status to restrict recipients to. Valid strings are:
-                                                                 ``null`` for no filtering (default), ``checked_in`` for
-                                                                 limiting to attendees that are or have been checked in, and
-                                                                 ``no_checkin`` for limiting to attendees who have not checked in.
 date_is_absolute                      boolean                    If ``true``, the email is sent at a specific point in time.
 send_date                             datetime                   If ``date_is_absolute`` is set: Date and time to send the email.
 send_offset_days                      integer                    If ``date_is_absolute`` is not set, this is the number of days
@@ -48,10 +37,7 @@ send_to string Can be ``"order
                                                                  or ``"both"``.
                                                                  date. Otherwise it is relative to the event start date.
 ===================================== ========================== =======================================================
-.. versionchanged:: 2023.7
-
-   The ``include_pending`` field has been deprecated.
-   The ``restrict_to_status`` field has been added.

 Endpoints
 ---------
@@ -88,12 +74,7 @@ Endpoints
        "template": {"en": "Don't forget your tickets, download them at {url}"},
        "all_products": true,
        "limit_products": [],
-       "restrict_to_status": [
-         "p",
-         "n__not_pending_approval_and_not_valid_if_pending",
-         "n__valid_if_pending"
-       ],
-       "checked_in_status": null,
        "include_pending": false,
        "send_date": null,
        "send_offset_days": 1,
        "send_offset_time": "18:00",
@@ -139,12 +120,7 @@ Endpoints
        "template": {"en": "Don't forget your tickets, download them at {url}"},
        "all_products": true,
        "limit_products": [],
-       "restrict_to_status": [
-         "p",
-         "n__not_pending_approval_and_not_valid_if_pending",
-         "n__valid_if_pending"
-       ],
-       "checked_in_status": null,
        "include_pending": false,
        "send_date": null,
        "send_offset_days": 1,
        "send_offset_time": "18:00",
@@ -181,12 +157,7 @@ Endpoints
        "template": {"en": "Don't forget your tickets, download them at {url}"},
        "all_products": true,
        "limit_products": [],
-       "restrict_to_status": [
-         "p",
-         "n__not_pending_approval_and_not_valid_if_pending",
-         "n__valid_if_pending"
-       ],
-       "checked_in_status": "checked_in",
        "include_pending": false,
        "send_date": null,
        "send_offset_days": 1,
        "send_offset_time": "18:00",
@@ -211,12 +182,7 @@ Endpoints
        "template": {"en": "Don't forget your tickets, download them at {url}"},
        "all_products": true,
        "limit_products": [],
-       "restrict_to_status": [
-         "p",
-         "n__not_pending_approval_and_not_valid_if_pending",
-         "n__valid_if_pending"
-       ],
-       "checked_in_status": "checked_in",
        "include_pending": false,
        "send_date": null,
        "send_offset_days": 1,
        "send_offset_time": "18:00",
@@ -269,12 +235,7 @@ Endpoints
        "template": {"en": "Don't forget your tickets, download them at {url}"},
        "all_products": true,
        "limit_products": [],
-       "restrict_to_status": [
-         "p",
-         "n__not_pending_approval_and_not_valid_if_pending",
-         "n__valid_if_pending"
-       ],
-       "checked_in_status": "checked_in",
        "include_pending": false,
        "send_date": null,
        "send_offset_days": 1,
        "send_offset_time": "18:00",
```
@@ -68,10 +68,6 @@ last_modified datetime Last modificati
   The ``date_from_before``, ``date_from_after``, ``date_to_before``, and ``date_to_after`` query parameters have been
   added.

.. versionchanged:: 2023.8.0

   For the organizer-wide endpoint, the ``search`` query parameter has been modified to filter sub-events by their parent event's slug too.

Endpoints
---------

@@ -476,7 +472,6 @@ Endpoints
:query date_to_after: If set to a date and time, only events that have an end date and end at or after the given time are returned.
:query date_to_before: If set to a date and time, only events that have an end date and end at or before the given time are returned.
:query ends_after: If set to a date and time, only events that happen during or after the given time are returned.
:query search: Only return events matching a given search query.
:query sales_channel: If set to a sales channel identifier, the response will only contain subevents from events available on this sales channel.
:param organizer: The ``slug`` field of a valid organizer
:param event: The ``slug`` field of the event to fetch

@@ -67,9 +67,6 @@ The following values for ``action_types`` are valid with pretix core:
* ``pretix.event.live.deactivated``
* ``pretix.event.testmode.activated``
* ``pretix.event.testmode.deactivated``
* ``pretix.customer.created``
* ``pretix.customer.changed``
* ``pretix.customer.anonymized``

Installed plugins might register more valid values.


@@ -61,7 +61,7 @@ Backend
item_formsets, order_search_filter_q, order_search_forms

.. automodule:: pretix.base.signals
   :members: logentry_display, logentry_object_link, requiredaction_display, timeline_events, orderposition_blocked_display, customer_created, customer_signed_in
   :members: logentry_display, logentry_object_link, requiredaction_display, timeline_events, orderposition_blocked_display

Vouchers
""""""""

@@ -70,8 +70,6 @@ The provider class

.. autoattribute:: settings_form_fields

.. autoattribute:: walletqueries

.. automethod:: settings_form_clean

.. automethod:: settings_content_render

@@ -37,7 +37,7 @@ you to execute a piece of code with a different locale:
This is very useful e.g. when sending an email to a user that has a different language than the user performing the
action that causes the mail to be sent.

.. _translation features: https://docs.djangoproject.com/en/4.2/topics/i18n/translation/
.. _translation features: https://docs.djangoproject.com/en/1.9/topics/i18n/translation/
.. _GNU gettext: https://www.gnu.org/software/gettext/
.. _strings: https://django-i18nfield.readthedocs.io/en/latest/strings.html
.. _database fields: https://django-i18nfield.readthedocs.io/en/latest/quickstart.html

@@ -18,4 +18,3 @@ Contents:
   email
   permissions
   logging
   locking

@@ -1,69 +0,0 @@
.. highlight:: python

Resource locking
================

.. versionchanged:: 2023.8

   Our locking mechanism changed heavily in version 2023.8. Read `this PR`_ for background information.

One of pretix's core objectives as a ticketing system could be described as the management of scarce resources.
Specifically, the following types of scarce resources exist in pretix:

- Quotas can limit the number of tickets available
- Seats can only be booked once
- Vouchers can only be used a limited number of times
- Some memberships can only be used a limited number of times

For all of these, it is critical that we prevent race conditions.
While for some events selling one ticket too many or too few would not be a big deal, for others it would be problematic, and selling the same seat twice would always be catastrophic.

We therefore implement a standardized locking approach across the system to limit concurrency in cases where it could
be problematic.

To acquire a lock on a set of quotas to create a new order that uses those quotas, you should follow this pattern::

    with transaction.atomic(durable=True):
        quotas = Quota.objects.filter(...)
        lock_objects(quotas, shared_lock_objects=[event])
        check_quota(quotas)
        create_ticket()

The lock will automatically be released at the end of your database transaction.

Generally, follow these guidelines during development:

- **Always** acquire a lock on every **quota**, **voucher** or **seat** that you "use" during your transaction. "Use"
  here means any action after which the quota, voucher or seat will be **less available**, such as creating a cart
  position, creating an order, creating a blocking voucher, etc.

- There is **no need** to acquire a lock if you **free up** capacity, e.g. by canceling an order, deleting a voucher, etc.

- **Always** acquire a shared lock on the ``event`` you are working in whenever you acquire a lock on a quota, voucher,
  or seat.

- Only call ``lock_objects`` **once** per transaction. If you violate this rule, `deadlocks`_ become possible.

- For best performance, call ``lock_objects`` as **late** in your transaction as possible, but always before you check
  if the desired resource is still available in sufficient quantity.
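The deadlock guideline above has a classic explanation: deadlocks require two transactions to acquire overlapping locks in different orders. A helper that receives all objects in a single call can sort them into one global order, which rules that out. This is a hedged sketch of the idea, not pretix's actual ``lock_objects`` implementation:

```python
def lock_order(object_ids):
    # De-duplicate and sort so that every transaction acquires its locks in
    # the same global order; then no transaction can wait on a lock while
    # holding one that sorts after it, which prevents lock-order deadlocks.
    return sorted(set(object_ids))
```

Calling ``lock_objects`` twice in one transaction bypasses exactly this ordering guarantee, which is why the guideline forbids it.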

Behind the scenes, the locking is implemented through `PostgreSQL advisory locks`_. You should also be aware of the following
properties of our system:

- In some situations, an exclusive lock on the ``event`` is used, such as when the system can't determine for sure which
  seats will become unavailable after the transaction.

- An exclusive lock on the event is also used if you pass more than 20 objects to ``lock_objects``. This is a performance
  trade-off because it would take too long to acquire all of the individual locks.

- If ``lock_objects`` is unable to acquire a lock within 3 seconds, a ``LockTimeoutException`` will be thrown.

.. note::

   We currently do not use ``lock_objects`` for memberships. Instead, we use ``select_for_update()`` on the membership
   model. This might change in the future, but you should usually not be concerned about it since
   ``validate_memberships_in_order(lock=True)`` will handle it for you.

.. _this PR: https://github.com/pretix/pretix/pull/2408
.. _deadlocks: https://www.postgresql.org/docs/current/explicit-locking.html#LOCKING-DEADLOCKS
.. _PostgreSQL advisory locks: https://www.postgresql.org/docs/11/explicit-locking.html#ADVISORY-LOCKS
@@ -15,33 +15,25 @@ and the admin panel is available at ``https://pretix.eu/control/event/bigorg/awesomecon/``.

If the organizer now configures a custom domain like ``tickets.bigorg.com``, their event will
from now on be available on ``https://tickets.bigorg.com/awesomecon/``. The former URL at
``pretix.eu`` will redirect there. It's also possible to do this for just an event, in which
case the event will be available on ``https://tickets.awesomecon.org/``.

However, the admin panel will still only be available on ``pretix.eu`` for convenience and security reasons.
``pretix.eu`` will redirect there. However, the admin panel will still only be available
on ``pretix.eu`` for convenience and security reasons.

URL routing
-----------

The hard part about implementing this URL routing in Django is that
``https://pretix.eu/bigorg/awesomecon/`` contains two parameters of nearly arbitrary content
and ``https://tickets.bigorg.com/awesomecon/`` contains only one and ``https://tickets.awesomecon.org/`` does not contain any.
The only robust way to do this is by having *separate* URL configurations for those three cases.
and ``https://tickets.bigorg.com/awesomecon/`` contains only one. The only robust way to do
this is by having *separate* URL configurations for those two cases. In pretix, we call the
former our ``maindomain`` config and the latter our ``subdomain`` config. For pretix's core
modules we do some magic to avoid duplicate configuration, but for a fairly simple plugin with
only a handful of routes, we recommend just configuring the two URL sets separately.

In pretix, we therefore do not have a global URL configuration, but three, living in the following modules:

- ``pretix.multidomain.maindomain_urlconf``
- ``pretix.multidomain.organizer_domain_urlconf``
- ``pretix.multidomain.event_domain_urlconf``

We provide some helper utilities to work with these to avoid duplicate configuration of the individual URLs.
The file ``urls.py`` inside your plugin package will be loaded and scanned for URL configuration
automatically and should be provided by any plugin that provides any view.
However, unlike plain Django, we look not only for a ``urlpatterns`` attribute on the module but support other
attributes like ``event_patterns`` and ``organizer_patterns`` as well.

For example, for a simple plugin that adds one URL to the backend and one event-level URL to the frontend, you can
create the following configuration in your ``urls.py``::
A very basic example that provides one view in the admin panel and one view in the frontend
could look like this::

    from django.urls import re_path

@@ -60,7 +52,7 @@ create the following configuration in your ``urls.py``::
As you can see, the view in the frontend is not included in the standard Django ``urlpatterns``
setting but in a separate list with the name ``event_patterns``. This will automatically prepend
the appropriate parameters to the regex (e.g. the event or the event and the organizer, depending
on the called domain). For organizer-level views, ``organizer_patterns`` works the same way.
on the called domain).

If you only provide URLs in the admin area, you do not need to provide an ``event_patterns`` attribute.
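The prefixing behavior described above can be illustrated with a small, self-contained sketch. The pattern and prefix regexes below are assumptions for illustration, not pretix's exact internals:

```python
import re

# An event-level pattern as a plugin would write it, without
# organizer/event parameters.
EVENT_PATTERN = r"^myplugin/hello/$"

def full_pattern(on_event_domain: bool) -> str:
    # On the main domain, organizer and event slugs are prepended to the
    # regex automatically; on an event's own custom domain, no prefix is needed.
    prefix = "" if on_event_domain else r"(?P<organizer>[^/]+)/(?P<event>[^/]+)/"
    return "^" + prefix + EVENT_PATTERN.lstrip("^")
```

With this, ``bigorg/awesomecon/myplugin/hello/`` matches on the main domain while plain ``myplugin/hello/`` matches on an event domain.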

@@ -79,16 +71,11 @@ is a python method that emulates a behavior similar to ``reverse``:

.. autofunction:: pretix.multidomain.urlreverse.eventreverse

If you need to communicate the URL externally, you can use a different method to ensure that it is always an absolute URL:

.. autofunction:: pretix.multidomain.urlreverse.build_absolute_uri

In addition, there is a template tag that works similarly to ``url`` but takes an event or organizer object
as its first argument and can be used like this::

    {% load eventurl %}
    <a href="{% eventurl request.event "presale:event.checkout" step="payment" %}">Pay</a>
    <a href="{% abseventurl request.event "presale:event.checkout" step="payment" %}">Pay</a>


Implementation details

@@ -12,4 +12,3 @@ Developer documentation
   api/index
   structure
   translation/index
   nfc/index

@@ -1,15 +0,0 @@
NFC media
=========

pretix supports using NFC chips as "reusable media", for example to store gift cards or tickets.

Most of this implementation currently lives in our proprietary app pretixPOS, but in the future it might also become part of our open-source pretixSCAN solution.
Either way, we want this to be an open ecosystem and therefore document the exact mechanisms in use on the following pages.

We support multiple implementations of NFC media, each documented on its own page:

.. toctree::
   :maxdepth: 2

   uid
   mf0aes
@@ -1,113 +0,0 @@
Mifare Ultralight AES
=====================

We offer an implementation that provides a higher security level than the UID-based approach and uses the `Mifare Ultralight AES`_ chip sold by NXP.
We believe the security model of this approach is adequate for the situations where it will usually be used, and we outline known risks below.

If you want to dive deeper into the properties of the Mifare Ultralight AES chip, we recommend reading the `data sheet`_.

Random UIDs
-----------

Mifare Ultralight AES supports a feature that returns a randomized UID every time a non-authenticated user tries to
read the UID. This has a strong privacy benefit, since no unauthorized entity can use the NFC chips to track users.
On the other hand, this reduces interoperability of the system. For example, this prevents you from using the same NFC
chips for a different purpose where you only need the UID. This will also prevent your guests from reading their UID
themselves with their phones, which might be useful e.g. in debugging situations.

Since there's no one-size-fits-all choice here, you can enable or disable this feature in the pretix organizer
settings. If you change it, the setting will apply to all chips newly encoded after the change.

Key management
--------------

For every organizer, the server will generate a "key set", which consists of a publicly known ID (random 32-bit integer) and two 16-byte keys ("diversification key" and "UID key").

Using our :ref:`Device authentication mechanism <rest-deviceauth>`, an authorized device can submit a locally generated RSA public key to the server.
This key can no longer be changed on the server once it is set, thus protecting against the attack scenario of a leaked device API token.

The server will then include key sets in the response to ``/api/v1/device/info``, encrypted with the device's RSA key.
This includes all key sets generated for the organizer the device belongs to, as well as all keys of organizers that have granted sufficient access to this organizer.

The device will decrypt the key sets using its RSA key and store the key sets locally.

.. warning:: The device **will** have access to the raw key sets. Therefore, there is a risk of leaked master keys if an
             authorized device is stolen or abused. Our implementation in pretixPOS attempts to make this very hard on
             modern, non-rooted Android devices by keeping them encrypted with the RSA key and only storing the RSA key
             in the hardware-backed keystore of the device. A sufficiently motivated attacker, however, will likely still
             be able to extract the keys from a stolen device.

Encoding a chip
---------------

When a new chip is encoded, the following steps will be taken:

- The UID of the chip is retrieved.

- A chip-specific key is generated using the mechanism documented in `AN10922`_ using the "diversification key" from the
  organizer's key set as the CMAC key and the diversification input concatenated in the form of ``0x01 + UID + APPID + SYSTEMID``
  with the following values:

  - The UID of the chip as ``UID``

  - ``"eu.pretix"`` (``0x65 0x75 0x2e 0x70 0x72 0x65 0x74 0x69 0x78``) as ``APPID``

  - The ``public_id`` from the organizer's key set as a 4-byte big-endian value as ``SYSTEMID``

- The chip-specific key is written to the chip as the "data protection key" (config pages 0x30 to 0x33)

- The UID key from the organizer's key set is written to the chip as the "UID retrieval key" (config pages 0x34 to 0x37)

- The config page 0x29 is set like this:

  - ``RID_ACT`` (random UID) to ``1`` or ``0`` based on the organizer's configuration
  - ``SEC_MSG_ACT`` (secure messaging) to ``1``
  - ``AUTH0`` (first page that needs authentication) to 0x04 (first non-UID page)

- The config page 0x2A is set like this:

  - ``PROT`` to ``0`` (only write access restricted, not read access)
  - ``AUTHLIM`` to ``256`` (maximum number of wrong authentications before "self-destruction")
  - Everything else to its default value (no lock bits are set)

- The ``public_id`` of the key set will be written to page 0x04 as a big-endian value

- The UID of the chip will be registered as a reusable medium on the server.
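The diversification input from the encoding steps above can be assembled as in the following sketch. Only the input construction is shown; the actual AES-CMAC derivation per AN10922 needs a cryptography library and is omitted, and the function name is ours, not pretix's:

```python
import struct

APPID = b"eu.pretix"  # 0x65 0x75 0x2e 0x70 0x72 0x65 0x74 0x69 0x78

def diversification_input(uid: bytes, public_id: int) -> bytes:
    # 0x01 + UID + APPID + SYSTEMID, where SYSTEMID is the key set's
    # public_id as a 4-byte big-endian value. The chip-specific key is then
    # derived by running AES-CMAC over this input with the organizer's
    # "diversification key" (not shown here).
    systemid = struct.pack(">I", public_id)
    return b"\x01" + uid + APPID + systemid
```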

.. warning:: During encoding, the chip-specific key and the UID key are transmitted in plain text over the air. The
             security model therefore relies on the encoding of chips being performed in a trusted physical environment
             to prevent a nearby attacker from sniffing the keys with a strong antenna.

.. note:: If an attacker tries to authenticate with the chip 256 times using the wrong key, the chip will become
          unusable. A chip may also become unusable if it is detached from the reader in the middle of the encoding
          process (even though we've tried to implement it in a way that makes this unlikely).

Usage
-----

When a chip is presented to the NFC reader, the following steps will be taken:

- Command ``GET_VERSION`` is used to determine if it is a Mifare Ultralight AES chip (if not, abort).

- Page 0x04 is read. If it is all zeroes, the chip is considered un-encoded (abort). If it contains a value that
  corresponds to the ``public_id`` of a known key set, this key set is used for all further operations. If it contains
  a different value, we consider this chip to belong to a different organizer or not to a pretix system at all (abort).

- An authentication with the chip using the UID key is performed.

- The UID of the chip will be read.

- The chip-specific key will be derived using the mechanism described above in the encoding step.

- An authentication with the chip using the chip-specific key is performed. If this is fully successful, this step
  proves that the chip knows the same chip-specific key as we do and is therefore an authentic chip encoded by us and
  we can trust its UID value.

- The UID is transmitted to the server to fetch the correct medium.
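The page 0x04 dispatch from the second step above can be sketched as follows; the function name and return values are illustrative assumptions, not pretix's internals:

```python
import struct

def classify_chip(page4: bytes, known_public_ids: set) -> str:
    # Page 0x04 holds the key set's public_id as a 4-byte big-endian value.
    if page4 == b"\x00\x00\x00\x00":
        return "unencoded"   # never written by us: abort
    (public_id,) = struct.unpack(">I", page4)
    if public_id in known_public_ids:
        return "known"       # use this key set for all further operations
    return "foreign"         # other organizer or not a pretix chip: abort
```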

During these steps, the keys are never transmitted in plain text and can thus not be sniffed by a nearby attacker
with a strong antenna.

.. _Mifare Ultralight AES: https://www.nxp.com/products/rfid-nfc/mifare-hf/mifare-ultralight/mifare-ultralight-aes-enhanced-security-for-limited-use-contactless-applications:MF0AESx20
.. _data sheet: https://www.nxp.com/docs/en/data-sheet/MF0AES(H)20.pdf
.. _AN10922: https://www.nxp.com/docs/en/application-note/AN10922.pdf
@@ -1,10 +0,0 @@
UID-based
=========

With UID-based NFC, only the unique ID (UID) of the NFC chip is used for identification purposes.
This can be used with virtually all NFC chips that provide compatibility with the NFC reader in use, typically at least all chips that comply with ISO/IEC 14443-3A.

We make only one restriction: The UID may not start with ``08``, since that usually signifies a randomized UID that changes on every read (which would not be very useful).
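That restriction can be expressed as a one-line check; a minimal sketch (the helper name is ours, not pretix's):

```python
def uid_allowed(uid: bytes) -> bool:
    # Reject UIDs starting with 0x08: by ISO/IEC 14443-3A convention these
    # signify randomized UIDs that change on every read, which makes them
    # useless as a stable identifier.
    return len(uid) > 0 and uid[0] != 0x08
```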

.. warning:: The UID-based approach provides only a very low level of security. It is easy to clone a chip with the same
             UID and impersonate someone else.
@@ -96,20 +96,6 @@ http://localhost:8000/control/ for the admin view.
port (for example because you develop on `pretixdroid`_), you can check
`Django's documentation`_ for more options.

When running the local development webserver, ensure Celery is not configured
in ``pretix.cfg``, i.e. you should remove anything such as::

    [celery]
    backend=redis://redis:6379/2
    broker=redis://redis:6379/2

If you choose to use Celery for development, you must also start a Celery worker
process::

    celery -A pretix.celery_app worker -l info

However, beware that code changes will not auto-reload within Celery.

.. _`checksandtests`:

Code checks and unit tests

@@ -293,16 +293,6 @@ with that information::
    </pretix-widget>

This works for the pretix Button as well, if you also specify a product.

As data-attributes are reactive, you can change them with JavaScript as well. Please note that once the user
has started the checkout process, we do not update the data-attributes in the existing checkout process, so as not to
interrupt the checkout UX.

When updating data-attributes through JavaScript, make sure you do not have a stale reference to the HTMLNode of the
widget. When the widget is created, the original HTMLNode may be replaced. So make sure to always get a
fresh reference like so:
``document.querySelectorAll("pretix-widget, pretix-button, .pretix-widget-wrapper")``

Currently, the following attributes are understood by pretix itself:

* ``data-email`` will pre-fill the order email field as well as the attendee email field (if enabled).
@@ -339,72 +329,125 @@ Hosted or pretix Enterprise are active, you can pass the following fields:
* If you use the campaigns plugin, you can pass a campaign ID as a value to ``data-campaign``. This way, all orders
  made through this widget will be counted towards this campaign.

* If you use the tracking plugin, you can enable cross-domain tracking. Please note: when you run your pretix-shop on a
  subdomain of your main tracking domain, then you do not need cross-domain tracking, as tracking automatically works
  across subdomains. See :ref:`custom_domain` for how to set this up.
* If you use the tracking plugin, you can enable cross-domain tracking. To do so, you need to initialize the
  pretix-widget manually. Use the HTML code to embed the widget and add one of the following code snippets. Make sure to
  replace all occurrences of <MEASUREMENT_ID> with your Google Analytics MEASUREMENT_ID (UA-XXXXXXX-X or G-XXXXXXXX).

  Please make sure to add the embedding website to your `Referral exclusions
  Please also make sure to add the embedding website to your `Referral exclusions
  <https://support.google.com/analytics/answer/2795830>`_ in your Google Analytics settings.

  Add Google Analytics as you normally would with all your `window.dataLayer` and `gtag` configurations. Also add the
  widget code normally. Then you have two options:
  If you use Google Analytics 4 (GA4 – G-XXXXXXXX)::

  * Block loading of the widget for at most 2 seconds or until Google’s client- and session-ID are loaded. This method
    uses `window.pretixWidgetCallback`. Note that if it takes longer than 2 seconds to load, client- and session-ID
    are never passed to the widget. Make sure to replace all occurrences of <MEASUREMENT_ID> with your Google
    Analytics MEASUREMENT_ID (G-XXXXXXXX)::

    <script async src="https://www.googletagmanager.com/gtag/js?id=<MEASUREMENT_ID>"></script>
    <script type="text/javascript">
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', '<MEASUREMENT_ID>');

    <script type="text/javascript">
      window.pretixWidgetCallback = function () {
        window.PretixWidget.build_widgets = false;
        window.addEventListener('load', function() { // Wait for GA to be loaded
          if (!window['google_tag_manager']) {
            window.PretixWidget.buildWidgets();
            return;
          }
      window.pretixWidgetCallback = function () {
        window.PretixWidget.build_widgets = false;
        window.addEventListener('load', function() { // Wait for GA to be loaded
          if (!window['google_tag_manager']) {
            window.PretixWidget.buildWidgets();
            return;
          }

          var clientId;
          var sessionId;
          var loadingTimeout;
          function build() {
            // use loadingTimeout to make sure build() is only called once
            if (!loadingTimeout) return;
            window.clearTimeout(loadingTimeout);
            loadingTimeout = null;
            if (clientId) window.PretixWidget.widget_data["tracking-ga-id"] = clientId;
            if (sessionId) window.PretixWidget.widget_data["tracking-ga-sessid"] = sessionId;
            window.PretixWidget.buildWidgets();
          };
          // make sure to build pretix-widgets if gtag fails to load either client_id or session_id
          loadingTimeout = window.setTimeout(build, 2000);
          var clientId;
          var sessionId;
          var loadingTimeout;
          function build() {
            // use loadingTimeout to make sure build() is only called once
            if (!loadingTimeout) return;
            window.clearTimeout(loadingTimeout);
            loadingTimeout = null;
            if (clientId) window.PretixWidget.widget_data["tracking-ga-id"] = clientId;
            if (sessionId) window.PretixWidget.widget_data["tracking-ga-sessid"] = sessionId;
            window.PretixWidget.buildWidgets();
          };
          // make sure to build pretix-widgets if gtag fails to load either client_id or session_id
          loadingTimeout = window.setTimeout(build, 2000);

          gtag('get', '<MEASUREMENT_ID>', 'client_id', function(id) {
            clientId = id;
            if (sessionId !== undefined) build();
          });
          gtag('get', '<MEASUREMENT_ID>', 'session_id', function(id) {
            sessionId = id;
            if (clientId !== undefined) build();
          });
        });
      };
    </script>

  * Or asynchronously set data-attributes – the widgets are shown immediately, but once the user has started checkout,
    data-attributes are not updated. Make sure to replace all occurrences of <MEASUREMENT_ID> with your Google
    Analytics MEASUREMENT_ID (G-XXXXXXXX)::

    <script type="text/javascript">
      window.addEventListener('load', function() {
        gtag('get', '<MEASUREMENT_ID>', 'client_id', function(id) {
          const widgets = document.querySelectorAll("pretix-widget, pretix-button, .pretix-widget-wrapper");
          widgets.forEach(widget => widget.setAttribute("data-tracking-ga-id", id))
          clientId = id;
          if (sessionId !== undefined) build();
        });
        gtag('get', '<MEASUREMENT_ID>', 'session_id', function(id) {
          const widgets = document.querySelectorAll("pretix-widget, pretix-button, .pretix-widget-wrapper");
          widgets.forEach(widget => widget.setAttribute("data-tracking-ga-sessid", id))
          sessionId = id;
          if (clientId !== undefined) build();
        });
      });
    </script>
      };
    </script>

If you use Universal Analytics with ``gtag.js`` (UA-XXXXXXX-X)::

    <script async src="https://www.googletagmanager.com/gtag/js?id=<MEASUREMENT_ID>"></script>
    <script type="text/javascript">
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', '<MEASUREMENT_ID>');

      window.pretixWidgetCallback = function () {
        window.PretixWidget.build_widgets = false;
        window.addEventListener('load', function() { // Wait for GA to be loaded
          if (!window['google_tag_manager']) {
            window.PretixWidget.buildWidgets();
            return;
          }

          // make sure to build pretix-widgets if gtag fails to load client_id
          var loadingTimeout = window.setTimeout(function() {
            loadingTimeout = null;
            window.PretixWidget.buildWidgets();
          }, 1000);

          gtag('get', '<MEASUREMENT_ID>', 'client_id', function(id) {
            if (loadingTimeout) {
              window.clearTimeout(loadingTimeout);
              window.PretixWidget.widget_data["tracking-ga-id"] = id;
              window.PretixWidget.buildWidgets();
            }
          });
        });
      };
    </script>

If you use ``analytics.js`` (Universal Analytics)::

    <script>
      (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
      (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
      m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
      })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

      ga('create', '<MEASUREMENT_ID>', 'auto');
      ga('send', 'pageview');

      window.pretixWidgetCallback = function () {
        window.PretixWidget.build_widgets = false;
        window.addEventListener('load', function() { // Wait for GA to be loaded
          if (!window['ga'] || !ga.create) {
            // Tracking is probably blocked
            window.PretixWidget.buildWidgets()
            return;
          }

          var loadingTimeout = window.setTimeout(function() {
            loadingTimeout = null;
            window.PretixWidget.buildWidgets();
          }, 1000);
          ga(function(tracker) {
            if (loadingTimeout) {
              window.clearTimeout(loadingTimeout);
              window.PretixWidget.widget_data["tracking-ga-id"] = tracker.get('clientId');
              window.PretixWidget.buildWidgets();
            }
          });
        });
      };
    </script>


.. _Let's Encrypt: https://letsencrypt.org/

@@ -36,7 +36,7 @@ dependencies = [
    "css-inline==0.8.*",
    "defusedcsv>=1.1.0",
    "dj-static",
    "Django==4.2.*",
    "Django==4.1.*",
    "django-bootstrap3==23.1.*",
    "django-compressor==4.3.*",
    "django-countries==7.5.*",
@@ -90,7 +90,7 @@ dependencies = [
    "pytz-deprecation-shim==0.1.*",
    "pyuca",
    "qrcode==7.4.*",
    "redis==4.6.*",
    "redis==4.5.*,>=4.5.4",
    "reportlab==4.0.*",
    "requests==2.31.*",
    "sentry-sdk==1.15.*",
@@ -110,10 +110,8 @@ dependencies = [
[project.optional-dependencies]
memcached = ["pylibmc"]
dev = [
    "aiohttp==3.8.*",
    "coverage",
    "coveralls",
    "fakeredis==2.18.*",
    "flake8==6.0.*",
    "freezegun",
    "isort==5.12.*",
@@ -121,7 +119,6 @@ dev = [
    "potypo",
    "pycodestyle==2.10.*",
    "pyflakes==3.0.*",
    "pytest-asyncio",
    "pytest-cache",
    "pytest-cov",
    "pytest-django==4.*",

@@ -19,4 +19,4 @@
# You should have received a copy of the GNU Affero General Public License along with this program. If not, see
# <https://www.gnu.org/licenses/>.
#
__version__ = "2023.8.0.dev0"
__version__ = "2023.6.3"

@@ -89,7 +89,6 @@ ALL_LANGUAGES = [
    ('fi', _('Finnish')),
    ('gl', _('Galician')),
    ('el', _('Greek')),
    ('id', _('Indonesian')),
    ('it', _('Italian')),
    ('lv', _('Latvian')),
    ('pl', _('Polish')),
@@ -197,14 +196,7 @@ STATICFILES_DIRS = [

STATICI18N_ROOT = os.path.join(BASE_DIR, "pretix/static")

STORAGES = {
    "default": {
        "BACKEND": "django.core.files.storage.FileSystemStorage",
    },
    "staticfiles": {
        "BACKEND": "django.contrib.staticfiles.storage.ManifestStaticFilesStorage",
    },
}
STATICFILES_STORAGE = 'django.contrib.staticfiles.storage.ManifestStaticFilesStorage'

# if os.path.exists(os.path.join(DATA_DIR, 'static')):
#     STATICFILES_DIRS.insert(0, os.path.join(DATA_DIR, 'static'))

@@ -223,7 +223,6 @@ class PretixPosSecurityProfile(AllowListSecurityProfile):
        ('POST', 'api-v1:checkinrpc.redeem'),
        ('GET', 'api-v1:checkinrpc.search'),
        ('POST', 'api-v1:reusablemedium-lookup'),
        ('POST', 'api-v1:reusablemedium-list'),
    )


@@ -32,13 +32,11 @@ class DiscountSerializer(I18nAwareModelSerializer):
                  'available_until', 'subevent_mode', 'condition_all_products', 'condition_limit_products',
                  'condition_apply_to_addons', 'condition_min_count', 'condition_min_value',
                  'benefit_discount_matching_percent', 'benefit_only_apply_to_cheapest_n_matches',
                  'benefit_same_products', 'benefit_limit_products', 'benefit_apply_to_addons',
                  'benefit_ignore_voucher_discounted', 'condition_ignore_voucher_discounted')
                  'condition_ignore_voucher_discounted')

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.fields['condition_limit_products'].queryset = self.context['event'].items.all()
        self.fields['benefit_limit_products'].queryset = self.context['event'].items.all()

    def validate(self, data):
        data = super().validate(data)

@@ -728,7 +728,6 @@ class EventSettingsSerializer(SettingsSerializer):
        'payment_term_minutes',
        'payment_term_last',
        'payment_term_expire_automatically',
        'payment_term_expire_delay_days',
        'payment_term_accept_late',
        'payment_explanation',
        'payment_pending_hidden',
@@ -817,10 +816,6 @@ class EventSettingsSerializer(SettingsSerializer):
        'reusable_media_type_nfc_uid',
        'reusable_media_type_nfc_uid_autocreate_giftcard',
        'reusable_media_type_nfc_uid_autocreate_giftcard_currency',
        'reusable_media_type_nfc_mf0aes',
        'reusable_media_type_nfc_mf0aes_autocreate_giftcard',
        'reusable_media_type_nfc_mf0aes_autocreate_giftcard_currency',
        'reusable_media_type_nfc_mf0aes_random_uid',
    ]
    readonly_fields = [
        # These are read-only since they are currently only settable on organizers, not events
@@ -830,10 +825,6 @@ class EventSettingsSerializer(SettingsSerializer):
        'reusable_media_type_nfc_uid',
        'reusable_media_type_nfc_uid_autocreate_giftcard',
        'reusable_media_type_nfc_uid_autocreate_giftcard_currency',
        'reusable_media_type_nfc_mf0aes',
        'reusable_media_type_nfc_mf0aes_autocreate_giftcard',
        'reusable_media_type_nfc_mf0aes_autocreate_giftcard_currency',
        'reusable_media_type_nfc_mf0aes_random_uid',
    ]

    def __init__(self, *args, **kwargs):
@@ -902,8 +893,6 @@ class DeviceEventSettingsSerializer(EventSettingsSerializer):
        'name_scheme',
        'reusable_media_type_barcode',
        'reusable_media_type_nfc_uid',
        'reusable_media_type_nfc_mf0aes',
        'reusable_media_type_nfc_mf0aes_random_uid',
        'system_question_order',
    ]


@@ -22,13 +22,11 @@
import logging
import os
from collections import Counter, defaultdict
from datetime import timedelta
from decimal import Decimal

import pycountry
from django.conf import settings
from django.core.files import File
from django.db import models
from django.db.models import F, Q
from django.utils.encoding import force_str
from django.utils.timezone import now
@@ -60,11 +58,10 @@ from pretix.base.models.orders import (
)
from pretix.base.pdf import get_images, get_variables
from pretix.base.services.cart import error_messages
from pretix.base.services.locking import LOCK_TRUST_WINDOW, lock_objects
from pretix.base.services.locking import NoLockManager
from pretix.base.services.pricing import (
    apply_discounts, get_line_price, get_listed_price, is_included_for_free,
)
from pretix.base.services.quotas import QuotaAvailability
from pretix.base.settings import COUNTRIES_WITH_STATE_IN_ADDRESS
from pretix.base.signals import register_ticket_outputs
from pretix.helpers.countries import CachedCountries
@@ -286,12 +283,11 @@ class FailedCheckinSerializer(I18nAwareModelSerializer):
    raw_item = serializers.PrimaryKeyRelatedField(queryset=Item.objects.none(), required=False, allow_null=True)
    raw_variation = serializers.PrimaryKeyRelatedField(queryset=ItemVariation.objects.none(), required=False, allow_null=True)
    raw_subevent = serializers.PrimaryKeyRelatedField(queryset=SubEvent.objects.none(), required=False, allow_null=True)
    nonce = serializers.CharField(required=False, allow_null=True)

    class Meta:
        model = Checkin
        fields = ('error_reason', 'error_explanation', 'raw_barcode', 'raw_item', 'raw_variation',
                  'raw_subevent', 'nonce', 'datetime', 'type', 'position')
                  'raw_subevent', 'datetime', 'type', 'position')

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
@@ -376,15 +372,11 @@ class PdfDataSerializer(serializers.Field):
            self.context['vars_images'] = get_images(self.context['event'])

        for k, f in self.context['vars'].items():
            if 'evaluate_bulk' in f:
                # Will be evaluated later by our list serializers
                res[k] = (f['evaluate_bulk'], instance)
            else:
                try:
                    res[k] = f['evaluate'](instance, instance.order, ev)
                except:
                    logger.exception('Evaluating PDF variable failed')
                    res[k] = '(error)'
            try:
                res[k] = f['evaluate'](instance, instance.order, ev)
            except:
                logger.exception('Evaluating PDF variable failed')
                res[k] = '(error)'

        if not hasattr(ev, '_cached_meta_data'):
            ev._cached_meta_data = ev.meta_data
@@ -437,38 +429,6 @@ class PdfDataSerializer(serializers.Field):
        return res


class OrderPositionListSerializer(serializers.ListSerializer):

    def to_representation(self, data):
        # We have a custom implementation of this method because PdfDataSerializer() might keep some elements unevaluated
        # with a (callable, input) tuple. We'll loop over these entries and evaluate them bulk-wise to save on SQL queries.

        if isinstance(self.parent, OrderSerializer) and isinstance(self.parent.parent, OrderListSerializer):
            # Do not execute our custom code because it will be executed by OrderListSerializer later for the
            # full result set.
            return super().to_representation(data)

        iterable = data.all() if isinstance(data, models.Manager) else data

        data = []
        evaluate_queue = defaultdict(list)

        for item in iterable:
            entry = self.child.to_representation(item)
            if "pdf_data" in entry:
                for k, v in entry["pdf_data"].items():
                    if isinstance(v, tuple) and callable(v[0]):
                        evaluate_queue[v[0]].append((v[1], entry, k))
            data.append(entry)

        for func, entries in evaluate_queue.items():
            results = func([item for (item, entry, k) in entries])
            for (item, entry, k), result in zip(entries, results):
                entry["pdf_data"][k] = result

        return data


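The deferred-evaluation idea used by the list serializer above can be illustrated outside of pretix. An expensive per-row field is first stored as a `(callable, input)` tuple; a second pass groups all inputs by callable and resolves each group with one batched call, which is what saves the per-row SQL queries. This is a minimal standalone sketch; `batched_upper`, `serialize_row`, and `serialize_list` are illustrative names, not pretix API:

```python
from collections import defaultdict

def batched_upper(values):
    # Stand-in for an expensive bulk operation (e.g. one SQL query for all rows).
    return [v.upper() for v in values]

def serialize_row(row):
    # Defer the expensive field: store a (callable, input) tuple instead of the value.
    return {"name": row, "fancy": (batched_upper, row)}

def serialize_list(rows):
    entries = [serialize_row(r) for r in rows]
    # Group every deferred field by its callable ...
    queue = defaultdict(list)
    for entry in entries:
        for key, value in entry.items():
            if isinstance(value, tuple) and callable(value[0]):
                queue[value[0]].append((value[1], entry, key))
    # ... then resolve each group with a single bulk call and patch the results back in.
    for func, jobs in queue.items():
        results = func([item for (item, entry, key) in jobs])
        for (item, entry, key), result in zip(jobs, results):
            entry[key] = result
    return entries

print(serialize_list(["a", "b"]))
```

The shape mirrors the serializer: one `defaultdict(list)` queue, one bulk call per distinct callable, results zipped back into the already-built entries.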
class OrderPositionSerializer(I18nAwareModelSerializer):
    checkins = CheckinSerializer(many=True, read_only=True)
    answers = AnswerSerializer(many=True)
@@ -480,7 +440,6 @@ class OrderPositionSerializer(I18nAwareModelSerializer):
    attendee_name = serializers.CharField(required=False)

    class Meta:
        list_serializer_class = OrderPositionListSerializer
        model = OrderPosition
        fields = ('id', 'order', 'positionid', 'item', 'variation', 'price', 'attendee_name', 'attendee_name_parts',
                  'company', 'street', 'zipcode', 'city', 'country', 'state', 'discount',
@@ -509,20 +468,6 @@ class OrderPositionSerializer(I18nAwareModelSerializer):
    def validate(self, data):
        raise TypeError("this serializer is readonly")

    def to_representation(self, data):
        if isinstance(self.parent, (OrderListSerializer, OrderPositionListSerializer)):
            # Do not execute our custom code because it will be executed by OrderListSerializer later for the
            # full result set.
            return super().to_representation(data)

        entry = super().to_representation(data)
        if "pdf_data" in entry:
            for k, v in entry["pdf_data"].items():
                if isinstance(v, tuple) and callable(v[0]):
                    entry["pdf_data"][k] = v[0]([v[1]])[0]

        return entry


class RequireAttentionField(serializers.Field):
    def to_representation(self, instance: OrderPosition):
@@ -617,7 +562,7 @@ class PaymentURLField(serializers.URLField):
    def to_representation(self, instance: OrderPayment):
        if instance.state != OrderPayment.PAYMENT_STATE_CREATED:
            return None
        return build_absolute_uri(instance.order.event, 'presale:event.order.pay', kwargs={
        return build_absolute_uri(self.context['event'], 'presale:event.order.pay', kwargs={
            'order': instance.order.code,
            'secret': instance.order.secret,
            'payment': instance.pk,
@@ -662,42 +607,13 @@ class OrderRefundSerializer(I18nAwareModelSerializer):

class OrderURLField(serializers.URLField):
    def to_representation(self, instance: Order):
        return build_absolute_uri(instance.event, 'presale:event.order', kwargs={
        return build_absolute_uri(self.context['event'], 'presale:event.order', kwargs={
            'order': instance.code,
            'secret': instance.secret,
        })


class OrderListSerializer(serializers.ListSerializer):

    def to_representation(self, data):
        # We have a custom implementation of this method because PdfDataSerializer() might keep some elements
        # unevaluated with a (callable, input) tuple. We'll loop over these entries and evaluate them bulk-wise to
        # save on SQL queries.
        iterable = data.all() if isinstance(data, models.Manager) else data

        data = []
        evaluate_queue = defaultdict(list)

        for item in iterable:
            entry = self.child.to_representation(item)
            for p in entry.get("positions", []):
                if "pdf_data" in p:
                    for k, v in p["pdf_data"].items():
                        if isinstance(v, tuple) and callable(v[0]):
                            evaluate_queue[v[0]].append((v[1], p, k))
            data.append(entry)

        for func, entries in evaluate_queue.items():
            results = func([item for (item, entry, k) in entries])
            for (item, entry, k), result in zip(entries, results):
                entry["pdf_data"][k] = result

        return data


class OrderSerializer(I18nAwareModelSerializer):
    event = SlugRelatedField(slug_field='slug', read_only=True)
    invoice_address = InvoiceAddressSerializer(allow_null=True)
    positions = OrderPositionSerializer(many=True, read_only=True)
    fees = OrderFeeSerializer(many=True, read_only=True)
@@ -711,9 +627,8 @@ class OrderSerializer(I18nAwareModelSerializer):

    class Meta:
        model = Order
        list_serializer_class = OrderListSerializer
        fields = (
            'code', 'event', 'status', 'testmode', 'secret', 'email', 'phone', 'locale', 'datetime', 'expires', 'payment_date',
            'code', 'status', 'testmode', 'secret', 'email', 'phone', 'locale', 'datetime', 'expires', 'payment_date',
            'payment_provider', 'fees', 'total', 'comment', 'custom_followup_at', 'invoice_address', 'positions', 'downloads',
            'checkin_attention', 'last_modified', 'payments', 'refunds', 'require_approval', 'sales_channel',
            'url', 'customer', 'valid_if_pending'
@@ -1146,367 +1061,338 @@ class OrderCreateSerializer(I18nAwareModelSerializer):
        else:
            ia = None

        quotas_by_item = {}
        quota_diff_for_locking = Counter()
        voucher_diff_for_locking = Counter()
        seat_diff_for_locking = Counter()
        quota_usage = Counter()
        voucher_usage = Counter()
        seat_usage = Counter()
        v_budget = {}
        now_dt = now()
        delete_cps = []
        consume_carts = validated_data.pop('consume_carts', [])

        lock_required = False
        for pos_data in positions_data:
            if (pos_data.get('item'), pos_data.get('variation'), pos_data.get('subevent')) not in quotas_by_item:
                quotas_by_item[pos_data.get('item'), pos_data.get('variation'), pos_data.get('subevent')] = list(
                    pos_data.get('variation').quotas.filter(subevent=pos_data.get('subevent'))
                    if pos_data.get('variation')
                    else pos_data.get('item').quotas.filter(subevent=pos_data.get('subevent'))
                )
            for q in quotas_by_item[pos_data.get('item'), pos_data.get('variation'), pos_data.get('subevent')]:
                quota_diff_for_locking[q] += 1
            if pos_data.get('voucher'):
                voucher_diff_for_locking[pos_data['voucher']] += 1
            if pos_data.get('seat'):
                try:
                    seat = self.context['event'].seats.get(seat_guid=pos_data['seat'], subevent=pos_data.get('subevent'))
                except Seat.DoesNotExist:
                    pos_data['seat'] = Seat.DoesNotExist
                else:
                    pos_data['seat'] = seat
                    seat_diff_for_locking[pos_data['seat']] += 1
            pos_data['_quotas'] = list(
                pos_data.get('variation').quotas.filter(subevent=pos_data.get('subevent'))
                if pos_data.get('variation')
                else pos_data.get('item').quotas.filter(subevent=pos_data.get('subevent'))
            )
            if pos_data.get('voucher') or pos_data.get('seat') or any(q.size is not None for q in pos_data['_quotas']):
                lock_required = True

        if consume_carts:
            offset = now() + timedelta(seconds=LOCK_TRUST_WINDOW)
            for cp in CartPosition.objects.filter(
                event=self.context['event'], cart_id__in=consume_carts, expires__gt=now_dt
            ):
                quotas = (cp.variation.quotas.filter(subevent=cp.subevent)
                          if cp.variation else cp.item.quotas.filter(subevent=cp.subevent))
                for quota in quotas:
                    if cp.expires > offset:
                        quota_diff_for_locking[quota] -= 1
                    quota_usage[quota] -= 1
                if cp.voucher:
                    if cp.expires > offset:
                        voucher_diff_for_locking[cp.voucher] -= 1
                    voucher_usage[cp.voucher] -= 1
                if cp.seat:
                    if cp.expires > offset:
                        seat_diff_for_locking[cp.seat] -= 1
                    seat_usage[cp.seat] -= 1
                delete_cps.append(cp)
        lockfn = self.context['event'].lock
        if simulate or not lock_required:
            lockfn = NoLockManager
        with lockfn() as now_dt:
            free_seats = set()
            seats_seen = set()
            consume_carts = validated_data.pop('consume_carts', [])
            delete_cps = []
            quota_avail_cache = {}
            v_budget = {}
            voucher_usage = Counter()
            if consume_carts:
                for cp in CartPosition.objects.filter(
                    event=self.context['event'], cart_id__in=consume_carts, expires__gt=now()
                ):
                    quotas = (cp.variation.quotas.filter(subevent=cp.subevent)
                              if cp.variation else cp.item.quotas.filter(subevent=cp.subevent))
                    for quota in quotas:
                        if quota not in quota_avail_cache:
                            quota_avail_cache[quota] = list(quota.availability())
                        if quota_avail_cache[quota][1] is not None:
                            quota_avail_cache[quota][1] += 1
                    if cp.voucher:
                        voucher_usage[cp.voucher] -= 1
                    if cp.expires > now_dt:
                        if cp.seat:
                            free_seats.add(cp.seat)
                    delete_cps.append(cp)

        if not simulate:
            full_lock_required = seat_diff_for_locking and self.context['event'].settings.seating_minimal_distance > 0
            if full_lock_required:
                # We lock the entire event in this case since we don't want to deal with fine-granular locking
                # in the case of seating distance enforcement
                lock_objects([self.context['event']])
            else:
                lock_objects(
                    [q for q, d in quota_diff_for_locking.items() if d > 0 and q.size is not None and not force] +
                    [v for v, d in voucher_diff_for_locking.items() if d > 0 and not force] +
                    [s for s, d in seat_diff_for_locking.items() if d > 0],
                    shared_lock_objects=[self.context['event']]
                )
        errs = [{} for p in positions_data]

        qa = QuotaAvailability()
        qa.queue(*[q for q, d in quota_diff_for_locking.items() if d > 0])
        qa.compute()
        for i, pos_data in enumerate(positions_data):

        # These are not technically correct as diff use due to the time offset applied above, so let's prevent accidental
        # use further down
        del quota_diff_for_locking, voucher_diff_for_locking, seat_diff_for_locking
            if pos_data.get('voucher'):
                v = pos_data['voucher']

        errs = [{} for p in positions_data]
                if pos_data.get('addon_to'):
                    errs[i]['voucher'] = ['Vouchers are currently not supported for add-on products.']
                    continue

        for i, pos_data in enumerate(positions_data):
            if pos_data.get('voucher'):
                v = pos_data['voucher']
                if not v.applies_to(pos_data['item'], pos_data.get('variation')):
                    errs[i]['voucher'] = [error_messages['voucher_invalid_item']]
                    continue

                if pos_data.get('addon_to'):
                    errs[i]['voucher'] = ['Vouchers are currently not supported for add-on products.']
                    continue
                if v.subevent_id and pos_data.get('subevent').pk != v.subevent_id:
                    errs[i]['voucher'] = [error_messages['voucher_invalid_subevent']]
                    continue

                if not v.applies_to(pos_data['item'], pos_data.get('variation')):
                    errs[i]['voucher'] = [error_messages['voucher_invalid_item']]
                    continue
                if v.valid_until is not None and v.valid_until < now_dt:
                    errs[i]['voucher'] = [error_messages['voucher_expired']]
                    continue

                if v.subevent_id and pos_data.get('subevent').pk != v.subevent_id:
                    errs[i]['voucher'] = [error_messages['voucher_invalid_subevent']]
                    continue
                voucher_usage[v] += 1
                if voucher_usage[v] > 0:
                    redeemed_in_carts = CartPosition.objects.filter(
                        Q(voucher=pos_data['voucher']) & Q(event=self.context['event']) & Q(expires__gte=now_dt)
                    ).exclude(pk__in=[cp.pk for cp in delete_cps])
                    v_avail = v.max_usages - v.redeemed - redeemed_in_carts.count()
                    if v_avail < voucher_usage[v]:
                        errs[i]['voucher'] = [
                            'The voucher has already been used the maximum number of times.'
                        ]

                if v.valid_until is not None and v.valid_until < now_dt:
                    errs[i]['voucher'] = [error_messages['voucher_expired']]
                    continue
                if v.budget is not None:
                    price = pos_data.get('price')
                    listed_price = get_listed_price(pos_data.get('item'), pos_data.get('variation'), pos_data.get('subevent'))

                voucher_usage[v] += 1
                if voucher_usage[v] > 0:
                    redeemed_in_carts = CartPosition.objects.filter(
                        Q(voucher=pos_data['voucher']) & Q(event=self.context['event']) & Q(expires__gte=now_dt)
                    ).exclude(pk__in=[cp.pk for cp in delete_cps])
                    v_avail = v.max_usages - v.redeemed - redeemed_in_carts.count()
                    if v_avail < voucher_usage[v]:
                        errs[i]['voucher'] = [
                            'The voucher has already been used the maximum number of times.'
                        ]
                    if pos_data.get('voucher'):
                        price_after_voucher = pos_data.get('voucher').calculate_price(listed_price)
                    else:
                        price_after_voucher = listed_price
                    if price is None:
                        price = price_after_voucher

                if v.budget is not None:
                    price = pos_data.get('price')
                    listed_price = get_listed_price(pos_data.get('item'), pos_data.get('variation'), pos_data.get('subevent'))
                    if v not in v_budget:
                        v_budget[v] = v.budget - v.budget_used()
                    disc = max(listed_price - price, 0)
                    if disc > v_budget[v]:
                        new_disc = v_budget[v]
                        v_budget[v] -= new_disc
                        if new_disc == Decimal('0.00') or pos_data.get('price') is not None:
                            errs[i]['voucher'] = [
                                'The voucher has a remaining budget of {}, therefore a discount of {} can not be '
                                'given.'.format(v_budget[v] + new_disc, disc)
                            ]
                            continue
                        pos_data['price'] = price + (disc - new_disc)
                    else:
                        v_budget[v] -= disc

            seated = pos_data.get('item').seat_category_mappings.filter(subevent=pos_data.get('subevent')).exists()
            if pos_data.get('seat'):
                if pos_data.get('addon_to'):
                    errs[i]['seat'] = ['Seats are currently not supported for add-on products.']
                    continue

                if not seated:
                    errs[i]['seat'] = ['The specified product does not allow to choose a seat.']
                try:
                    seat = self.context['event'].seats.get(seat_guid=pos_data['seat'], subevent=pos_data.get('subevent'))
                except Seat.DoesNotExist:
                    errs[i]['seat'] = ['The specified seat does not exist.']
                else:
                    pos_data['seat'] = seat
                    if (seat not in free_seats and not seat.is_available(sales_channel=validated_data.get('sales_channel', 'web'))) or seat in seats_seen:
                        errs[i]['seat'] = [gettext_lazy('The selected seat "{seat}" is not available.').format(seat=seat.name)]
                    seats_seen.add(seat)
            elif seated:
                errs[i]['seat'] = ['The specified product requires to choose a seat.']

            requested_valid_from = pos_data.pop('requested_valid_from', None)
            if 'valid_from' not in pos_data and 'valid_until' not in pos_data:
                valid_from, valid_until = pos_data['item'].compute_validity(
                    requested_start=(
                        max(requested_valid_from, now())
                        if requested_valid_from and pos_data['item'].validity_dynamic_start_choice
                        else now()
                    ),
                    enforce_start_limit=True,
                    override_tz=self.context['event'].timezone,
                )
                pos_data['valid_from'] = valid_from
                pos_data['valid_until'] = valid_until

        if not force:
            for i, pos_data in enumerate(positions_data):
                if pos_data.get('voucher'):
                    price_after_voucher = pos_data.get('voucher').calculate_price(listed_price)
                    if pos_data['voucher'].allow_ignore_quota or pos_data['voucher'].block_quota:
                        continue

                if pos_data.get('subevent'):
                    if pos_data.get('item').pk in pos_data['subevent'].item_overrides and pos_data['subevent'].item_overrides[pos_data['item'].pk].disabled:
                        errs[i]['item'] = [gettext_lazy('The product "{}" is not available on this date.').format(
                            str(pos_data.get('item'))
                        )]
                    if (
                        pos_data.get('variation') and pos_data['variation'].pk in pos_data['subevent'].var_overrides and
                        pos_data['subevent'].var_overrides[pos_data['variation'].pk].disabled
                    ):
                        errs[i]['item'] = [gettext_lazy('The product "{}" is not available on this date.').format(
                            str(pos_data.get('item'))
                        )]

                new_quotas = pos_data['_quotas']
                if len(new_quotas) == 0:
                    errs[i]['item'] = [gettext_lazy('The product "{}" is not assigned to a quota.').format(
                        str(pos_data.get('item'))
                    )]
                else:
                    for quota in new_quotas:
                        if quota not in quota_avail_cache:
                            quota_avail_cache[quota] = list(quota.availability())

                        if quota_avail_cache[quota][1] is not None:
                            quota_avail_cache[quota][1] -= 1
                            if quota_avail_cache[quota][1] < 0:
                                errs[i]['item'] = [
                                    gettext_lazy('There is not enough quota available on quota "{}" to perform the operation.').format(
                                        quota.name
                                    )
                                ]

        if any(errs):
            raise ValidationError({'positions': errs})

        if validated_data.get('locale', None) is None:
            validated_data['locale'] = self.context['event'].settings.locale
        order = Order(event=self.context['event'], **validated_data)
        order.set_expires(subevents=[p.get('subevent') for p in positions_data])
        order.meta_info = "{}"
        order.total = Decimal('0.00')
        if validated_data.get('require_approval') is not None:
            order.require_approval = validated_data['require_approval']
        if simulate:
            order = WrappedModel(order)
            order.last_modified = now()
            order.code = 'PREVIEW'
        else:
            order.save()

        if ia:
            if not simulate:
                ia.order = order
                ia.save()
            else:
                order.invoice_address = ia
                ia.last_modified = now()

        # Generate position objects
        pos_map = {}
        for pos_data in positions_data:
            addon_to = pos_data.pop('addon_to', None)
            attendee_name = pos_data.pop('attendee_name', '')
            if attendee_name and not pos_data.get('attendee_name_parts'):
                pos_data['attendee_name_parts'] = {
                    '_legacy': attendee_name
                }
            pos = OrderPosition(**{k: v for k, v in pos_data.items() if k != 'answers' and k != '_quotas' and k != 'use_reusable_medium'})
            if simulate:
                pos.order = order._wrapped
            else:
                pos.order = order
            if addon_to:
                if simulate:
                    pos.addon_to = pos_map[addon_to]
                else:
                    pos.addon_to = pos_map[addon_to]

            pos_map[pos.positionid] = pos
            pos_data['__instance'] = pos

        # Calculate prices if not set
        for pos_data in positions_data:
            pos = pos_data['__instance']
            if pos.addon_to_id and is_included_for_free(pos.item, pos.addon_to):
                listed_price = Decimal('0.00')
            else:
                listed_price = get_listed_price(pos.item, pos.variation, pos.subevent)

            if pos.price is None:
                if pos.voucher:
                    price_after_voucher = pos.voucher.calculate_price(listed_price)
                else:
                    price_after_voucher = listed_price
                if price is None:
                    price = price_after_voucher

                if v not in v_budget:
                    v_budget[v] = v.budget - v.budget_used()
                disc = max(listed_price - price, 0)
                if disc > v_budget[v]:
                    new_disc = v_budget[v]
                    v_budget[v] -= new_disc
                    if new_disc == Decimal('0.00') or pos_data.get('price') is not None:
                        errs[i]['voucher'] = [
                            'The voucher has a remaining budget of {}, therefore a discount of {} can not be '
                            'given.'.format(v_budget[v] + new_disc, disc)
                        ]
                        continue
                    pos_data['price'] = price + (disc - new_disc)
                else:
                    v_budget[v] -= disc

            seated = pos_data.get('item').seat_category_mappings.filter(subevent=pos_data.get('subevent')).exists()
            if pos_data.get('seat'):
                if pos_data.get('addon_to'):
                    errs[i]['seat'] = ['Seats are currently not supported for add-on products.']
                    continue
                if not seated:
                    errs[i]['seat'] = ['The specified product does not allow to choose a seat.']
                seat = pos_data['seat']
                if seat is Seat.DoesNotExist:
                    errs[i]['seat'] = ['The specified seat does not exist.']
                else:
                    seat_usage[seat] += 1
                    if (seat_usage[seat] > 0 and not seat.is_available(sales_channel=validated_data.get('sales_channel', 'web'))) or seat_usage[seat] > 1:
                        errs[i]['seat'] = [gettext_lazy('The selected seat "{seat}" is not available.').format(seat=seat.name)]
            elif seated:
                errs[i]['seat'] = ['The specified product requires to choose a seat.']

            requested_valid_from = pos_data.pop('requested_valid_from', None)
            if 'valid_from' not in pos_data and 'valid_until' not in pos_data:
                valid_from, valid_until = pos_data['item'].compute_validity(
                    requested_start=(
                        max(requested_valid_from, now())
                        if requested_valid_from and pos_data['item'].validity_dynamic_start_choice
                        else now()
                    ),
                    enforce_start_limit=True,
                    override_tz=self.context['event'].timezone,
                )
                pos_data['valid_from'] = valid_from
                pos_data['valid_until'] = valid_until

        if not force:
            for i, pos_data in enumerate(positions_data):
                if pos_data.get('voucher'):
                    if pos_data['voucher'].allow_ignore_quota or pos_data['voucher'].block_quota:
                        continue

                if pos_data.get('subevent'):
                    if pos_data.get('item').pk in pos_data['subevent'].item_overrides and pos_data['subevent'].item_overrides[pos_data['item'].pk].disabled:
                        errs[i]['item'] = [gettext_lazy('The product "{}" is not available on this date.').format(
                            str(pos_data.get('item'))
                        )]
                    if (
                        pos_data.get('variation') and pos_data['variation'].pk in pos_data['subevent'].var_overrides and
                        pos_data['subevent'].var_overrides[pos_data['variation'].pk].disabled
                    ):
                        errs[i]['item'] = [gettext_lazy('The product "{}" is not available on this date.').format(
                            str(pos_data.get('item'))
                        )]

                new_quotas = quotas_by_item[pos_data.get('item'), pos_data.get('variation'), pos_data.get('subevent')]
                if len(new_quotas) == 0:
                    errs[i]['item'] = [gettext_lazy('The product "{}" is not assigned to a quota.').format(
                        str(pos_data.get('item'))
                    )]
                else:
                    for quota in new_quotas:
                        quota_usage[quota] += 1
                        if quota_usage[quota] > 0 and qa.results[quota][1] is not None:
                            if qa.results[quota][1] < quota_usage[quota]:
                                errs[i]['item'] = [
                                    gettext_lazy('There is not enough quota available on quota "{}" to perform the operation.').format(
                                        quota.name
                                    )
                                ]

        if any(errs):
            raise ValidationError({'positions': errs})

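The quota check in the validation loop above counts the requested usage per quota with a `Counter` and compares it against a precomputed availability snapshot, collecting errors instead of failing fast. That accounting can be sketched standalone with plain dicts in place of pretix's `QuotaAvailability`; `check_quotas` and its parameters are illustrative names, not pretix API:

```python
from collections import Counter

def check_quotas(requested, available):
    # requested: list of quota names, one entry per position consuming the quota.
    # available: dict mapping quota name -> remaining capacity (None = unlimited).
    usage = Counter()
    errors = []
    for quota in requested:
        usage[quota] += 1
        # Unlimited quotas (None) never fail; limited ones fail once the
        # cumulative requested usage exceeds the snapshot of remaining capacity.
        if available[quota] is not None and usage[quota] > available[quota]:
            errors.append(quota)
    return errors

print(check_quotas(["A", "A", "B"], {"A": 1, "B": None}))
```

Counting against one snapshot, rather than re-querying availability per position, is what lets the real code validate many positions with a single `qa.compute()` pass.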
        if validated_data.get('locale', None) is None:
            validated_data['locale'] = self.context['event'].settings.locale
        order = Order(event=self.context['event'], **validated_data)
        order.set_expires(subevents=[p.get('subevent') for p in positions_data])
        order.meta_info = "{}"
        order.total = Decimal('0.00')
        if validated_data.get('require_approval') is not None:
            order.require_approval = validated_data['require_approval']
        if simulate:
            order = WrappedModel(order)
            order.last_modified = now()
            order.code = 'PREVIEW'
        else:
            order.save()

        if ia:
            if not simulate:
                ia.order = order
                ia.save()
            else:
                order.invoice_address = ia
                ia.last_modified = now()

        # Generate position objects
        pos_map = {}
        for pos_data in positions_data:
            addon_to = pos_data.pop('addon_to', None)
            attendee_name = pos_data.pop('attendee_name', '')
            if attendee_name and not pos_data.get('attendee_name_parts'):
                pos_data['attendee_name_parts'] = {
                    '_legacy': attendee_name
                }
            pos = OrderPosition(**{k: v for k, v in pos_data.items() if k != 'answers' and k != '_quotas' and k != 'use_reusable_medium'})
            if simulate:
                pos.order = order._wrapped
            else:
                pos.order = order
            if addon_to:
                if simulate:
                    pos.addon_to = pos_map[addon_to]
                else:
                    pos.addon_to = pos_map[addon_to]

            pos_map[pos.positionid] = pos
            pos_data['__instance'] = pos

        # Calculate prices if not set
        for pos_data in positions_data:
            pos = pos_data['__instance']
            if pos.addon_to_id and is_included_for_free(pos.item, pos.addon_to):
                listed_price = Decimal('0.00')
            else:
                listed_price = get_listed_price(pos.item, pos.variation, pos.subevent)

            if pos.price is None:
                if pos.voucher:
                    price_after_voucher = pos.voucher.calculate_price(listed_price)
                else:
                    price_after_voucher = listed_price

                line_price = get_line_price(
                    price_after_voucher=price_after_voucher,
                    custom_price_input=None,
                    custom_price_input_is_net=False,
                    tax_rule=pos.item.tax_rule,
                    invoice_address=ia,
                    bundled_sum=Decimal('0.00'),
|
||||
)
|
||||
pos.price = line_price.gross
|
||||
pos._auto_generated_price = True
|
||||
else:
|
||||
if pos.voucher:
|
||||
if not pos.item.tax_rule or pos.item.tax_rule.price_includes_tax:
|
||||
price_after_voucher = max(pos.price, pos.voucher.calculate_price(listed_price))
|
||||
else:
|
||||
price_after_voucher = max(pos.price - pos.tax_value, pos.voucher.calculate_price(listed_price))
|
||||
else:
|
||||
price_after_voucher = listed_price
|
||||
pos._auto_generated_price = False
|
||||
pos._voucher_discount = listed_price - price_after_voucher
|
||||
if pos.voucher:
|
||||
pos.voucher_budget_use = max(listed_price - price_after_voucher, Decimal('0.00'))
|
||||
|
||||
order_positions = [pos_data['__instance'] for pos_data in positions_data]
|
||||
discount_results = apply_discounts(
|
||||
self.context['event'],
|
||||
order.sales_channel,
|
||||
[
|
||||
(cp.item_id, cp.subevent_id, cp.price, bool(cp.addon_to), cp.is_bundled, pos._voucher_discount)
|
||||
for cp in order_positions
|
||||
]
|
||||
)
|
||||
for cp, (new_price, discount) in zip(order_positions, discount_results):
|
||||
if new_price != pos.price and pos._auto_generated_price:
|
||||
pos.price = new_price
|
||||
pos.discount = discount
|
||||
|
||||
# Save instances
|
||||
for pos_data in positions_data:
|
||||
answers_data = pos_data.pop('answers', [])
|
||||
use_reusable_medium = pos_data.pop('use_reusable_medium', None)
|
||||
pos = pos_data['__instance']
|
||||
pos._calculate_tax()
|
||||
|
||||
if simulate:
|
||||
pos = WrappedModel(pos)
|
||||
pos.id = 0
|
||||
answers = []
|
||||
for answ_data in answers_data:
|
||||
options = answ_data.pop('options', [])
|
||||
answ = WrappedModel(QuestionAnswer(**answ_data))
|
||||
answ.options = WrappedList(options)
|
||||
answers.append(answ)
|
||||
pos.answers = answers
|
||||
pos.pseudonymization_id = "PREVIEW"
|
||||
pos.checkins = []
|
||||
pos_map[pos.positionid] = pos
|
||||
else:
|
||||
if pos.voucher:
|
||||
Voucher.objects.filter(pk=pos.voucher.pk).update(redeemed=F('redeemed') + 1)
|
||||
pos.save()
|
||||
seen_answers = set()
|
||||
for answ_data in answers_data:
|
||||
# Workaround for a pretixPOS bug :-(
|
||||
if answ_data.get('question') in seen_answers:
|
||||
continue
|
||||
seen_answers.add(answ_data.get('question'))
|
||||
|
||||
options = answ_data.pop('options', [])
|
||||
|
||||
if isinstance(answ_data['answer'], File):
|
||||
an = answ_data.pop('answer')
|
||||
answ = pos.answers.create(**answ_data, answer='')
|
||||
answ.file.save(os.path.basename(an.name), an, save=False)
|
||||
answ.answer = 'file://' + answ.file.name
|
||||
answ.save()
|
||||
else:
|
||||
answ = pos.answers.create(**answ_data)
|
||||
answ.options.add(*options)
|
||||
|
||||
if use_reusable_medium:
|
||||
use_reusable_medium.linked_orderposition = pos
|
||||
use_reusable_medium.save(update_fields=['linked_orderposition'])
|
||||
use_reusable_medium.log_action(
|
||||
'pretix.reusable_medium.linked_orderposition.changed',
|
||||
data={
|
||||
'by_order': order.code,
|
||||
'linked_orderposition': pos.pk,
|
||||
}
                    )

        if not simulate:
            for cp in delete_cps:
                if cp.addon_to_id:
                    continue
                cp.addons.all().delete()
                cp.delete()

        order.total = sum([p.price for p in pos_map.values()])
        fees = []
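The quota check at the top of the block above tallies prospective usage per quota and reports an error for each position once a quota's remaining availability is exceeded. A minimal standalone sketch of that counting logic — with hypothetical plain-data shapes for positions, quota assignments, and availability, not pretix's actual models:

```python
from collections import Counter

def check_quota_usage(positions, quotas_by_item, availability):
    """Count prospective usage per quota; return per-position errors.

    positions: list of (item, variation, subevent) keys
    quotas_by_item: maps such a key to the quotas the product belongs to
    availability: maps quota -> remaining capacity (None = unlimited)
    """
    errors = {}
    usage = Counter()
    for i, key in enumerate(positions):
        quotas = quotas_by_item.get(key, [])
        if not quotas:
            # Mirrors the "not assigned to a quota" branch above.
            errors[i] = 'not assigned to a quota'
            continue
        for quota in quotas:
            usage[quota] += 1
            remaining = availability[quota]
            if remaining is not None and usage[quota] > remaining:
                errors[i] = 'not enough quota available'
    return errors

# Two tickets against a quota with one seat left: the second position fails.
errs = check_quota_usage(
    positions=[('ticket', None, None), ('ticket', None, None)],
    quotas_by_item={('ticket', None, None): ['q1']},
    availability={'q1': 1},
)
```

Counting across all positions first, rather than checking each in isolation, is what catches the case where several positions in one request jointly exceed a quota.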
@@ -1626,7 +1512,6 @@ class InlineInvoiceLineSerializer(I18nAwareModelSerializer):


class InvoiceSerializer(I18nAwareModelSerializer):
-    event = SlugRelatedField(slug_field='slug', read_only=True)
    order = serializers.SlugRelatedField(slug_field='code', read_only=True)
    refers = serializers.SlugRelatedField(slug_field='full_invoice_no', read_only=True)
    lines = InlineInvoiceLineSerializer(many=True)
@@ -1635,7 +1520,7 @@ class InvoiceSerializer(I18nAwareModelSerializer):

    class Meta:
        model = Invoice
-        fields = ('event', 'order', 'number', 'is_cancellation', 'invoice_from', 'invoice_from_name', 'invoice_from_zipcode',
+        fields = ('order', 'number', 'is_cancellation', 'invoice_from', 'invoice_from_name', 'invoice_from_zipcode',
                  'invoice_from_city', 'invoice_from_country', 'invoice_from_tax_id', 'invoice_from_vat_id',
                  'invoice_to', 'invoice_to_company', 'invoice_to_name', 'invoice_to_street', 'invoice_to_zipcode',
                  'invoice_to_city', 'invoice_to_state', 'invoice_to_country', 'invoice_to_vat_id', 'invoice_to_beneficiary',
@@ -94,14 +94,6 @@ class CustomerSerializer(I18nAwareModelSerializer):
        data['name_parts']['_scheme'] = self.context['request'].organizer.settings.name_scheme
        return data

-    def validate_email(self, value):
-        qs = Customer.objects.filter(organizer=self.context['organizer'], email__iexact=value)
-        if self.instance and self.instance.pk:
-            qs = qs.exclude(pk=self.instance.pk)
-        if qs.exists():
-            raise ValidationError(_("An account with this email address is already registered."))
-        return value
-

class CustomerCreateSerializer(CustomerSerializer):
    send_email = serializers.BooleanField(default=False, required=False, allow_null=True)
@@ -259,8 +251,6 @@ class DeviceSerializer(serializers.ModelSerializer):
    unique_serial = serializers.CharField(read_only=True)
    hardware_brand = serializers.CharField(read_only=True)
    hardware_model = serializers.CharField(read_only=True)
-    os_name = serializers.CharField(read_only=True)
-    os_version = serializers.CharField(read_only=True)
    software_brand = serializers.CharField(read_only=True)
    software_version = serializers.CharField(read_only=True)
    created = serializers.DateTimeField(read_only=True)
@@ -273,7 +263,7 @@ class DeviceSerializer(serializers.ModelSerializer):
        fields = (
            'device_id', 'unique_serial', 'initialization_token', 'all_events', 'limit_events',
            'revoked', 'name', 'created', 'initialized', 'hardware_brand', 'hardware_model',
-            'os_name', 'os_version', 'software_brand', 'software_version', 'security_profile'
+            'software_brand', 'software_version', 'security_profile'
        )
@@ -400,9 +390,6 @@ class OrganizerSettingsSerializer(SettingsSerializer):
        'reusable_media_type_nfc_uid',
        'reusable_media_type_nfc_uid_autocreate_giftcard',
        'reusable_media_type_nfc_uid_autocreate_giftcard_currency',
-        'reusable_media_type_nfc_mf0aes',
-        'reusable_media_type_nfc_mf0aes_autocreate_giftcard',
-        'reusable_media_type_nfc_mf0aes_autocreate_giftcard_currency',
    ]

    def __init__(self, *args, **kwargs):
@@ -94,13 +94,8 @@ class VoucherSerializer(I18nAwareModelSerializer):
        )
        if check_quota:
            Voucher.clean_quota_check(
-                full_data,
-                full_data.get('max_usages', 1) - (self.instance.redeemed if self.instance else 0),
-                self.instance,
-                self.context.get('event'),
-                full_data.get('quota'),
-                full_data.get('item'),
-                full_data.get('variation')
+                full_data, 1, self.instance, self.context.get('event'),
+                full_data.get('quota'), full_data.get('item'), full_data.get('variation')
            )
        Voucher.clean_voucher_code(full_data, self.context.get('event'), self.instance.pk if self.instance else None)
@@ -61,8 +61,6 @@ orga_router.register(r'membershiptypes', organizer.MembershipTypeViewSet)
orga_router.register(r'reusablemedia', media.ReusableMediaViewSet)
orga_router.register(r'teams', organizer.TeamViewSet)
orga_router.register(r'devices', organizer.DeviceViewSet)
-orga_router.register(r'orders', order.OrganizerOrderViewSet)
-orga_router.register(r'invoices', order.InvoiceViewSet)
orga_router.register(r'exporters', exporters.OrganizerExportersViewSet, basename='exporters')

team_router = routers.DefaultRouter()
@@ -79,7 +77,7 @@ event_router.register(r'questions', item.QuestionViewSet)
event_router.register(r'discounts', discount.DiscountViewSet)
event_router.register(r'quotas', item.QuotaViewSet)
event_router.register(r'vouchers', voucher.VoucherViewSet)
-event_router.register(r'orders', order.EventOrderViewSet)
+event_router.register(r'orders', order.OrderViewSet)
event_router.register(r'orderpositions', order.OrderPositionViewSet)
event_router.register(r'invoices', order.InvoiceViewSet)
event_router.register(r'revokedsecrets', order.RevokedSecretViewSet, basename='revokedsecrets')
@@ -25,7 +25,6 @@ from typing import List
from django.db import transaction
from django.utils.crypto import get_random_string
from django.utils.functional import cached_property
from django.utils.timezone import now
from django.utils.translation import gettext as _
from rest_framework import status, viewsets
from rest_framework.decorators import action
@@ -42,7 +41,7 @@ from pretix.base.models import CartPosition
from pretix.base.services.cart import (
    _get_quota_availability, _get_voucher_availability, error_messages,
)
-from pretix.base.services.locking import lock_objects
+from pretix.base.services.locking import NoLockManager


class CartPositionViewSet(CreateModelMixin, DestroyModelMixin, viewsets.ReadOnlyModelViewSet):
@@ -151,21 +150,12 @@ class CartPositionViewSet(CreateModelMixin, DestroyModelMixin, viewsets.ReadOnly
                quota_diff[q] += 1

        seats_seen = set()
-        now_dt = now()
-        with transaction.atomic():
-            full_lock_required = seat_diff and self.request.event.settings.seating_minimal_distance > 0
-            if full_lock_required:
-                # We lock the entire event in this case since we don't want to deal with fine-granular locking
-                # in the case of seating distance enforcement
-                lock_objects([self.request.event])
-            else:
-                lock_objects(
-                    [q for q, d in quota_diff.items() if q.size is not None and d > 0] +
-                    [v for v, d in voucher_use_diff.items() if d > 0] +
-                    [s for s, d in seat_diff.items() if d > 0],
-                    shared_lock_objects=[self.request.event]
-                )
-
+        lockfn = NoLockManager
+        if self._require_locking(quota_diff, voucher_use_diff, seat_diff):
+            lockfn = self.request.event.lock
+
+        with lockfn() as now_dt, transaction.atomic():
            vouchers_ok, vouchers_depend_on_cart = _get_voucher_availability(
                self.request.event,
                voucher_use_diff,
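The hunk above swaps fine-grained `lock_objects` locking for the older `lockfn = NoLockManager` pattern, where the chosen lock function is a context manager that yields a timestamp. A rough sketch of that pattern — with stand-in context managers under assumed semantics, not pretix's actual locking implementation:

```python
from contextlib import contextmanager
from datetime import datetime, timezone

@contextmanager
def no_lock_manager():
    # No-op "lock": acquire nothing, just yield the current time.
    yield datetime.now(timezone.utc)

@contextmanager
def event_lock():
    # Placeholder for a real lock on the event; a real implementation
    # would e.g. take a database row lock here before yielding.
    yield datetime.now(timezone.utc)

def process(require_locking):
    # Pick the lock function up front, then enter it uniformly.
    lockfn = no_lock_manager
    if require_locking:
        lockfn = event_lock
    with lockfn() as now_dt:
        return now_dt
```

Selecting the context manager ahead of time keeps the critical section identical whether or not a lock is actually taken, which is why both the locked and unlocked paths can share one `with` block.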
@@ -164,21 +164,8 @@ class CheckinListViewSet(viewsets.ModelViewSet):
            secret=serializer.validated_data['raw_barcode']
        ).first()

-        clist = self.get_object()
-        if serializer.validated_data.get('nonce'):
-            if kwargs.get('position'):
-                prev = kwargs['position'].all_checkins.filter(nonce=serializer.validated_data['nonce']).first()
-            else:
-                prev = clist.checkins.filter(
-                    nonce=serializer.validated_data['nonce'],
-                    raw_barcode=serializer.validated_data['raw_barcode'],
-                ).first()
-            if prev:
-                # Ignore because nonce is already handled
-                return Response(serializer.data, status=201)
-
        c = serializer.save(
-            list=clist,
+            list=self.get_object(),
            successful=False,
            forced=True,
            force_sent=True,
@@ -278,7 +265,6 @@ with scopes_disabled():

    def __init__(self, *args, **kwargs):
        self.checkinlist = kwargs.pop('checkinlist')
-        self.gate = kwargs.pop('gate')
        super().__init__(*args, **kwargs)

    def has_checkin_qs(self, queryset, name, value):
@@ -288,7 +274,7 @@ with scopes_disabled():
        if not self.checkinlist.rules:
            return queryset
        return queryset.filter(
-            SQLLogic(self.checkinlist, self.gate).apply(self.checkinlist.rules)
+            SQLLogic(self.checkinlist).apply(self.checkinlist.rules)
        ).filter(
            Q(valid_from__isnull=True) | Q(valid_from__lte=now()),
            Q(valid_until__isnull=True) | Q(valid_until__gte=now()),
@@ -410,7 +396,7 @@ def _checkin_list_position_queryset(checkinlists, ignore_status=False, ignore_pr

def _redeem_process(*, checkinlists, raw_barcode, answers_data, datetime, force, checkin_type, ignore_unpaid, nonce,
                    untrusted_input, user, auth, expand, pdf_data, request, questions_supported, canceled_supported,
-                    source_type='barcode', legacy_url_support=False, simulate=False, gate=None):
+                    source_type='barcode', legacy_url_support=False, simulate=False):
    if not checkinlists:
        raise ValidationError('No check-in list passed.')

@@ -418,7 +404,7 @@ def _redeem_process(*, checkinlists, raw_barcode, answers_data, datetime, force,
    prefetch_related_objects([cl for cl in checkinlists if not cl.all_products], 'limit_products')

    device = auth if isinstance(auth, Device) else None
-    gate = gate or (auth.gate if isinstance(auth, Device) else None)
+    gate = auth.gate if isinstance(auth, Device) else None

    context = {
        'request': request,
@@ -673,7 +659,6 @@ def _redeem_process(*, checkinlists, raw_barcode, answers_data, datetime, force,
                raw_source_type=source_type,
                from_revoked_secret=from_revoked_secret,
                simulate=simulate,
-                gate=gate,
            )
        except RequiredQuestionsError as e:
            return Response({
@@ -772,7 +757,6 @@ class CheckinListPositionViewSet(viewsets.ReadOnlyModelViewSet):
    def get_filterset_kwargs(self):
        return {
            'checkinlist': self.checkinlist,
-            'gate': self.request.auth.gate if isinstance(self.request.auth, Device) else None,
        }

    @cached_property
@@ -19,12 +19,8 @@
# You should have received a copy of the GNU Affero General Public License along with this program. If not, see
# <https://www.gnu.org/licenses/>.
#
-import base64
import logging

-from cryptography.hazmat.backends.openssl.backend import Backend
-from cryptography.hazmat.primitives.asymmetric import padding
-from cryptography.hazmat.primitives.serialization import load_pem_public_key
from django.db.models import Exists, OuterRef, Q
from django.db.models.functions import Coalesce
from django.utils.timezone import now
@@ -38,8 +34,6 @@ from pretix.api.auth.device import DeviceTokenAuthentication
from pretix.api.views.version import numeric_version
from pretix.base.models import CheckinList, Device, SubEvent
from pretix.base.models.devices import Gate, generate_api_token
-from pretix.base.models.media import MediumKeySet
-from pretix.base.services.media import get_keysets_for_organizer

logger = logging.getLogger(__name__)

@@ -48,73 +42,17 @@ class InitializationRequestSerializer(serializers.Serializer):
    token = serializers.CharField(max_length=190)
    hardware_brand = serializers.CharField(max_length=190)
    hardware_model = serializers.CharField(max_length=190)
-    os_name = serializers.CharField(max_length=190, required=False, allow_null=True)
-    os_version = serializers.CharField(max_length=190, required=False, allow_null=True)
    software_brand = serializers.CharField(max_length=190)
    software_version = serializers.CharField(max_length=190)
    info = serializers.JSONField(required=False, allow_null=True)
-    rsa_pubkey = serializers.CharField(required=False, allow_null=True)
-
-    def validate(self, attrs):
-        if attrs.get('rsa_pubkey'):
-            try:
-                load_pem_public_key(
-                    attrs['rsa_pubkey'].encode(), Backend()
-                )
-            except:
-                raise ValidationError({'rsa_pubkey': ['Not a valid public key.']})
-        return attrs


class UpdateRequestSerializer(serializers.Serializer):
    hardware_brand = serializers.CharField(max_length=190)
    hardware_model = serializers.CharField(max_length=190)
-    os_name = serializers.CharField(max_length=190, required=False, allow_null=True)
-    os_version = serializers.CharField(max_length=190, required=False, allow_null=True)
    software_brand = serializers.CharField(max_length=190)
    software_version = serializers.CharField(max_length=190)
    info = serializers.JSONField(required=False, allow_null=True)
-    rsa_pubkey = serializers.CharField(required=False, allow_null=True)
-
-    def validate(self, attrs):
-        if attrs.get('rsa_pubkey'):
-            try:
-                load_pem_public_key(
-                    attrs['rsa_pubkey'].encode(), Backend()
-                )
-            except:
-                raise ValidationError({'rsa_pubkey': ['Not a valid public key.']})
-        return attrs
-
-
-class RSAEncryptedField(serializers.Field):
-    def to_representation(self, value):
-        public_key = load_pem_public_key(
-            self.context['device'].rsa_pubkey.encode(), Backend()
-        )
-        cipher_text = public_key.encrypt(
-            # RSA/ECB/PKCS1Padding
-            value,
-            padding.PKCS1v15()
-        )
-        return base64.b64encode(cipher_text).decode()
-
-
-class MediumKeySetSerializer(serializers.ModelSerializer):
-    uid_key = RSAEncryptedField(read_only=True)
-    diversification_key = RSAEncryptedField(read_only=True)
-    organizer = serializers.SlugRelatedField(slug_field='slug', read_only=True)
-
-    class Meta:
-        model = MediumKeySet
-        fields = [
-            'public_id',
-            'organizer',
-            'active',
-            'media_type',
-            'uid_key',
-            'diversification_key',
-        ]


class GateSerializer(serializers.ModelSerializer):
@@ -161,12 +99,9 @@ class InitializeView(APIView):
        device.initialized = now()
        device.hardware_brand = serializer.validated_data.get('hardware_brand')
        device.hardware_model = serializer.validated_data.get('hardware_model')
-        device.os_name = serializer.validated_data.get('os_name')
-        device.os_version = serializer.validated_data.get('os_version')
        device.software_brand = serializer.validated_data.get('software_brand')
        device.software_version = serializer.validated_data.get('software_version')
        device.info = serializer.validated_data.get('info')
-        device.rsa_pubkey = serializer.validated_data.get('rsa_pubkey')
        device.api_token = generate_api_token()
        device.save()

@@ -185,15 +120,8 @@ class UpdateView(APIView):
        device = request.auth
        device.hardware_brand = serializer.validated_data.get('hardware_brand')
        device.hardware_model = serializer.validated_data.get('hardware_model')
-        device.os_name = serializer.validated_data.get('os_name')
-        device.os_version = serializer.validated_data.get('os_version')
        device.software_brand = serializer.validated_data.get('software_brand')
        device.software_version = serializer.validated_data.get('software_version')
-        if serializer.validated_data.get('rsa_pubkey') and serializer.validated_data.get('rsa_pubkey') != device.rsa_pubkey:
-            if device.rsa_pubkey:
-                raise ValidationError({'rsa_pubkey': ['You cannot change the rsa_pubkey of the device once it is set.']})
-            else:
-                device.rsa_pubkey = serializer.validated_data.get('rsa_pubkey')
        device.info = serializer.validated_data.get('info')
        device.save()
        device.log_action('pretix.device.updated', data=serializer.validated_data, auth=device)
@@ -241,12 +169,8 @@ class InfoView(APIView):
                'pretix': __version__,
                'pretix_numeric': numeric_version(__version__),
            }
-        },
-        'medium_key_sets': MediumKeySetSerializer(
-            get_keysets_for_organizer(device.organizer),
-            many=True,
-            context={'device': request.auth}
-        ).data if device.rsa_pubkey else []
-    }
+        }
    })
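The removed `RSAEncryptedField` above PKCS1v15-encrypts a value with the device's registered RSA public key and base64-encodes the result, so only the device holding the private key can recover it. A minimal round-trip sketch of that scheme with the `cryptography` library — the generated keypair and `encrypt_for_device` helper are illustrative stand-ins, not pretix APIs:

```python
import base64

from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.serialization import (
    Encoding, PublicFormat, load_pem_public_key,
)

# Stand-in for a device's registered RSA keypair (rsa_pubkey in the code above).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pem = private_key.public_key().public_bytes(
    Encoding.PEM, PublicFormat.SubjectPublicKeyInfo
).decode()

def encrypt_for_device(rsa_pubkey_pem: str, value: bytes) -> str:
    # Mirrors RSAEncryptedField.to_representation:
    # RSA/ECB/PKCS1Padding, then base64 for JSON transport.
    public_key = load_pem_public_key(rsa_pubkey_pem.encode())
    return base64.b64encode(public_key.encrypt(value, padding.PKCS1v15())).decode()

token = encrypt_for_device(pem, b'secret-key-material')
# The device side decrypts with its private key:
plain = private_key.decrypt(base64.b64decode(token), padding.PKCS1v15())
```

Note that recent `cryptography` releases take no explicit backend argument in `load_pem_public_key`, unlike the older `Backend()` call in the removed code.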
|
||||
@@ -381,29 +381,16 @@ with scopes_disabled():
|
||||
| Q(location__icontains=i18ncomp(value))
|
||||
)
|
||||
|
||||
class OrganizerSubEventFilter(SubEventFilter):
|
||||
def search_qs(self, queryset, name, value):
|
||||
return queryset.filter(
|
||||
Q(name__icontains=i18ncomp(value))
|
||||
| Q(event__slug__icontains=value)
|
||||
| Q(location__icontains=i18ncomp(value))
|
||||
)
|
||||
|
||||
|
||||
class SubEventViewSet(ConditionalListView, viewsets.ModelViewSet):
|
||||
serializer_class = SubEventSerializer
|
||||
queryset = SubEvent.objects.none()
|
||||
write_permission = 'can_change_event_settings'
|
||||
filter_backends = (DjangoFilterBackend, TotalOrderingFilter)
|
||||
filterset_class = SubEventFilter
|
||||
ordering = ('date_from',)
|
||||
ordering_fields = ('id', 'date_from', 'last_modified')
|
||||
|
||||
@property
|
||||
def filterset_class(self):
|
||||
if getattr(self.request, 'event', None):
|
||||
return SubEventFilter
|
||||
return OrganizerSubEventFilter
|
||||
|
||||
def get_queryset(self):
|
||||
if getattr(self.request, 'event', None):
|
||||
qs = self.request.event.subevents
|
||||
@@ -428,7 +415,6 @@ class SubEventViewSet(ConditionalListView, viewsets.ModelViewSet):
|
||||
'subeventitem_set',
|
||||
'subeventitemvariation_set',
|
||||
'meta_values',
|
||||
'meta_values__property',
|
||||
Prefetch(
|
||||
'seat_category_mappings',
|
||||
to_attr='_seat_category_mappings',
|
||||
|
||||
@@ -104,12 +104,6 @@ class ReusableMediaViewSet(viewsets.ModelViewSet):
            auth=self.request.auth,
            data=merge_dicts(self.request.data, {'id': inst.pk})
        )
-        mt = MEDIA_TYPES.get(serializer.validated_data["type"])
-        if mt:
-            m = mt.handle_new(self.request.organizer, inst, self.request.user, self.request.auth)
-            if m:
-                s = self.get_serializer(m)
-                return Response({"result": s.data})

    @transaction.atomic()
    def perform_update(self, serializer):
@@ -45,7 +45,6 @@ from rest_framework.exceptions import (
|
||||
APIException, NotFound, PermissionDenied, ValidationError,
|
||||
)
|
||||
from rest_framework.mixins import CreateModelMixin
|
||||
from rest_framework.permissions import SAFE_METHODS
|
||||
from rest_framework.response import Response
|
||||
|
||||
from pretix.api.models import OAuthAccessToken
|
||||
@@ -117,16 +116,12 @@ with scopes_disabled():
|
||||
|
||||
@scopes_disabled()
|
||||
def subevent_after_qs(self, qs, name, value):
|
||||
if getattr(self.request, 'event', None):
|
||||
subevents = self.request.event.subevents
|
||||
else:
|
||||
subevents = SubEvent.objects.filter(event__organizer=self.request.organizer)
|
||||
|
||||
qs = qs.filter(
|
||||
pk__in=Subquery(
|
||||
OrderPosition.all.filter(
|
||||
subevent_id__in=subevents.filter(
|
||||
subevent_id__in=SubEvent.objects.filter(
|
||||
Q(date_to__gt=value) | Q(date_from__gt=value, date_to__isnull=True),
|
||||
event=self.request.event
|
||||
).values_list('id'),
|
||||
).values_list('order_id')
|
||||
)
|
||||
@@ -134,16 +129,12 @@ with scopes_disabled():
|
||||
return qs
|
||||
|
||||
def subevent_before_qs(self, qs, name, value):
|
||||
if getattr(self.request, 'event', None):
|
||||
subevents = self.request.event.subevents
|
||||
else:
|
||||
subevents = SubEvent.objects.filter(event__organizer=self.request.organizer)
|
||||
|
||||
qs = qs.filter(
|
||||
pk__in=Subquery(
|
||||
OrderPosition.all.filter(
|
||||
subevent_id__in=subevents.filter(
|
||||
subevent_id__in=SubEvent.objects.filter(
|
||||
Q(date_from__lt=value),
|
||||
event=self.request.event
|
||||
).values_list('id'),
|
||||
).values_list('order_id')
|
||||
)
|
||||
@@ -195,7 +186,7 @@ with scopes_disabled():
|
||||
)
|
||||
|
||||
|
||||
class OrderViewSetMixin:
|
||||
class OrderViewSet(viewsets.ModelViewSet):
|
||||
serializer_class = OrderSerializer
|
||||
queryset = Order.objects.none()
|
||||
filter_backends = (DjangoFilterBackend, TotalOrderingFilter)
|
||||
@@ -203,12 +194,19 @@ class OrderViewSetMixin:
|
||||
ordering_fields = ('datetime', 'code', 'status', 'last_modified')
|
||||
filterset_class = OrderFilter
|
||||
lookup_field = 'code'
|
||||
permission = 'can_view_orders'
|
||||
write_permission = 'can_change_orders'
|
||||
|
||||
def get_base_queryset(self):
|
||||
raise NotImplementedError()
|
||||
def get_serializer_context(self):
|
||||
ctx = super().get_serializer_context()
|
||||
ctx['event'] = self.request.event
|
||||
ctx['pdf_data'] = self.request.query_params.get('pdf_data', 'false') == 'true'
|
||||
ctx['exclude'] = self.request.query_params.getlist('exclude')
|
||||
ctx['include'] = self.request.query_params.getlist('include')
|
||||
return ctx
|
||||
|
||||
def get_queryset(self):
|
||||
qs = self.get_base_queryset()
|
||||
qs = self.request.event.orders
|
||||
if 'fees' not in self.request.GET.getlist('exclude'):
|
||||
if self.request.query_params.get('include_canceled_fees', 'false') == 'true':
|
||||
fqs = OrderFee.all
|
||||
@@ -230,12 +228,11 @@ class OrderViewSetMixin:
|
||||
opq = OrderPosition.all
|
||||
else:
|
||||
opq = OrderPosition.objects
|
||||
if request.query_params.get('pdf_data', 'false') == 'true' and getattr(request, 'event', None):
|
||||
if request.query_params.get('pdf_data', 'false') == 'true':
|
||||
prefetch_related_objects([request.organizer], 'meta_properties')
|
||||
prefetch_related_objects(
|
||||
[request.event],
|
||||
Prefetch('meta_values', queryset=EventMetaValue.objects.select_related('property'),
|
||||
to_attr='meta_values_cached'),
|
||||
Prefetch('meta_values', queryset=EventMetaValue.objects.select_related('property'), to_attr='meta_values_cached'),
|
||||
'questions',
|
||||
'item_meta_properties',
|
||||
)
|
||||
@@ -270,12 +267,13 @@ class OrderViewSetMixin:
|
||||
)
|
||||
)
|
||||
|
||||
def get_serializer_context(self):
|
||||
ctx = super().get_serializer_context()
|
||||
ctx['exclude'] = self.request.query_params.getlist('exclude')
|
||||
ctx['include'] = self.request.query_params.getlist('include')
|
||||
ctx['pdf_data'] = False
|
||||
return ctx
|
||||
def _get_output_provider(self, identifier):
|
||||
responses = register_ticket_outputs.send(self.request.event)
|
||||
for receiver, response in responses:
|
||||
prov = response(self.request.event)
|
||||
if prov.identifier == identifier:
|
||||
return prov
|
||||
raise NotFound('Unknown output provider.')
|
||||
|
||||
@scopes_disabled() # we are sure enough that get_queryset() is correct, so we save some performance
def list(self, request, **kwargs):
@@ -292,45 +290,6 @@ class OrderViewSetMixin:
serializer = self.get_serializer(queryset, many=True)
return Response(serializer.data, headers={'X-Page-Generated': date})


class OrganizerOrderViewSet(OrderViewSetMixin, viewsets.ReadOnlyModelViewSet):
def get_base_queryset(self):
perm = "can_view_orders" if self.request.method in SAFE_METHODS else "can_change_orders"
if isinstance(self.request.auth, (TeamAPIToken, Device)):
return Order.objects.filter(
event__organizer=self.request.organizer,
event__in=self.request.auth.get_events_with_permission(perm, request=self.request)
)
elif self.request.user.is_authenticated:
return Order.objects.filter(
event__organizer=self.request.organizer,
event__in=self.request.user.get_events_with_permission(perm, request=self.request)
)
else:
raise PermissionDenied()


class EventOrderViewSet(OrderViewSetMixin, viewsets.ModelViewSet):
permission = 'can_view_orders'
write_permission = 'can_change_orders'

def get_serializer_context(self):
ctx = super().get_serializer_context()
ctx['event'] = self.request.event
ctx['pdf_data'] = self.request.query_params.get('pdf_data', 'false') == 'true'
return ctx

def get_base_queryset(self):
return self.request.event.orders

def _get_output_provider(self, identifier):
responses = register_ticket_outputs.send(self.request.event)
for receiver, response in responses:
prov = response(self.request.event)
if prov.identifier == identifier:
return prov
raise NotFound('Unknown output provider.')

@action(detail=True, url_name='download', url_path='download/(?P<output>[^/]+)')
def download(self, request, output, **kwargs):
provider = self._get_output_provider(output)
@@ -987,7 +946,6 @@ with scopes_disabled():
| Q(addon_to__attendee_email__icontains=value)
| Q(order__code__istartswith=value)
| Q(order__invoice_address__name_cached__icontains=value)
| Q(order__invoice_address__company__icontains=value)
| Q(order__email__icontains=value)
| Q(pk__in=matching_media)
)
@@ -1824,24 +1782,11 @@ class InvoiceViewSet(viewsets.ReadOnlyModelViewSet):
write_permission = 'can_change_orders'

def get_queryset(self):
perm = "can_view_orders" if self.request.method in SAFE_METHODS else "can_change_orders"
if getattr(self.request, 'event', None):
qs = self.request.event.invoices
elif isinstance(self.request.auth, (TeamAPIToken, Device)):
qs = Invoice.objects.filter(
event__organizer=self.request.organizer,
event__in=self.request.auth.get_events_with_permission(perm, request=self.request)
)
elif self.request.user.is_authenticated:
qs = Invoice.objects.filter(
event__organizer=self.request.organizer,
event__in=self.request.user.get_events_with_permission(perm, request=self.request)
)
return qs.prefetch_related('lines').select_related('order', 'refers').annotate(
return self.request.event.invoices.prefetch_related('lines').select_related('order', 'refers').annotate(
nr=Concat('prefix', 'invoice_no')
)

@action(detail=True)
def download(self, request, **kwargs):
invoice = self.get_object()

@@ -1860,7 +1805,7 @@ class InvoiceViewSet(viewsets.ReadOnlyModelViewSet):
return resp

@action(detail=True, methods=['POST'])
def regenerate(self, request, **kwargs):
inv = self.get_object()
if inv.canceled:
raise ValidationError('The invoice has already been canceled.')
@@ -1870,7 +1815,7 @@ class InvoiceViewSet(viewsets.ReadOnlyModelViewSet):
raise PermissionDenied('The invoice file is no longer stored on the server.')
elif inv.sent_to_organizer:
raise PermissionDenied('The invoice file has already been exported.')
elif now().astimezone(inv.event.timezone).date() - inv.date > datetime.timedelta(days=1):
elif now().astimezone(self.request.event.timezone).date() - inv.date > datetime.timedelta(days=1):
raise PermissionDenied('The invoice file is too old to be regenerated.')
else:
inv = regenerate_invoice(inv)
@@ -1885,7 +1830,7 @@ class InvoiceViewSet(viewsets.ReadOnlyModelViewSet):
return Response(status=204)

@action(detail=True, methods=['POST'])
def reissue(self, request, **kwargs):
inv = self.get_object()
if inv.canceled:
raise ValidationError('The invoice has already been canceled.')

@@ -19,6 +19,8 @@
# You should have received a copy of the GNU Affero General Public License along with this program. If not, see
# <https://www.gnu.org/licenses/>.
#
import contextlib

from django.db import transaction
from django.db.models import F, Q
from django.utils.timezone import now
@@ -67,9 +69,30 @@ class VoucherViewSet(viewsets.ModelViewSet):
def get_queryset(self):
return self.request.event.vouchers.select_related('seat').all()

@transaction.atomic()
def _predict_quota_check(self, data, instance):
# This method predicts if Voucher.clean_quota_needs_checking
# *might* later require a quota check. It is only approximate
# and returns True a little too often. The point is to avoid
# locks when we know we won't need them.
if 'allow_ignore_quota' in data and data.get('allow_ignore_quota'):
return False
if instance and 'allow_ignore_quota' not in data and instance.allow_ignore_quota:
return False

if 'block_quota' in data and not data.get('block_quota'):
return False
if instance and 'block_quota' not in data and not instance.block_quota:
return False

return True

def create(self, request, *args, **kwargs):
return super().create(request, *args, **kwargs)
if self._predict_quota_check(request.data, None):
lockfn = request.event.lock
else:
lockfn = contextlib.suppress  # noop context manager
with lockfn():
return super().create(request, *args, **kwargs)
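The pattern in this hunk — predict whether a quota check might be needed and only then take the expensive event lock, otherwise substitute a no-op context manager — can be sketched outside Django. This is a hedged standalone sketch: `_event_lock`, `predict_quota_check` and `create_voucher` are stand-ins for `request.event.lock`, `_predict_quota_check` and the view method, not pretix's real API.

```python
import contextlib
import threading

# Stand-in for request.event.lock (an assumption, not pretix's real lock API).
_event_lock = threading.Lock()

def predict_quota_check(data: dict) -> bool:
    # Rough analogue of _predict_quota_check: only vouchers that block
    # quota and may not ignore it can require a quota check.
    if data.get('allow_ignore_quota'):
        return False
    if not data.get('block_quota'):
        return False
    return True

def create_voucher(data: dict) -> str:
    # contextlib.suppress() with no arguments suppresses nothing, so it
    # acts as a cheap no-op context manager (contextlib.nullcontext() is
    # the more explicit modern spelling).
    lockfn = _event_lock if predict_quota_check(data) else contextlib.suppress()
    with lockfn:
        return 'created'
```

The point of the predicate is that it may return `True` too often (harmless, just a redundant lock) but must never return `False` when a check is actually needed.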

def perform_create(self, serializer):
serializer.save(event=self.request.event)
@@ -85,9 +108,13 @@ class VoucherViewSet(viewsets.ModelViewSet):
ctx['event'] = self.request.event
return ctx

@transaction.atomic()
def update(self, request, *args, **kwargs):
return super().update(request, *args, **kwargs)
if self._predict_quota_check(request.data, self.get_object()):
lockfn = request.event.lock
else:
lockfn = contextlib.suppress  # noop context manager
with lockfn():
return super().update(request, *args, **kwargs)

def perform_update(self, serializer):
serializer.save(event=self.request.event)
@@ -113,18 +140,22 @@ class VoucherViewSet(viewsets.ModelViewSet):
super().perform_destroy(instance)

@action(detail=False, methods=['POST'])
@transaction.atomic()
def batch_create(self, request, *args, **kwargs):
serializer = self.get_serializer(data=request.data, many=True)
serializer.is_valid(raise_exception=True)
with transaction.atomic():
serializer.save(event=self.request.event)
for i, v in enumerate(serializer.instance):
v.log_action(
'pretix.voucher.added',
user=self.request.user,
auth=self.request.auth,
data=self.request.data[i]
)
if any(self._predict_quota_check(d, None) for d in request.data):
lockfn = request.event.lock
else:
lockfn = contextlib.suppress  # noop context manager
with lockfn():
serializer = self.get_serializer(data=request.data, many=True)
serializer.is_valid(raise_exception=True)
with transaction.atomic():
serializer.save(event=self.request.event)
for i, v in enumerate(serializer.instance):
v.log_action(
'pretix.voucher.added',
user=self.request.user,
auth=self.request.auth,
data=self.request.data[i]
)
headers = self.get_success_headers(serializer.data)
return Response(serializer.data, status=status.HTTP_201_CREATED, headers=headers)

@@ -202,21 +202,6 @@ class ParametrizedWaitingListEntryWebhookEvent(ParametrizedWebhookEvent):
}


class ParametrizedCustomerWebhookEvent(ParametrizedWebhookEvent):

def build_payload(self, logentry: LogEntry):
customer = logentry.content_object
if not customer:
return None

return {
'notification_id': logentry.pk,
'organizer': customer.organizer.slug,
'customer': customer.identifier,
'action': logentry.action_type,
}


@receiver(register_webhook_events, dispatch_uid="base_register_default_webhook_events")
def register_default_webhook_events(sender, **kwargs):
return (
@@ -365,18 +350,6 @@ def register_default_webhook_events(sender, **kwargs):
'pretix.event.orders.waitinglist.voucher_assigned',
_('Waiting list entry received voucher'),
),
ParametrizedCustomerWebhookEvent(
'pretix.customer.created',
_('Customer account created'),
),
ParametrizedCustomerWebhookEvent(
'pretix.customer.changed',
_('Customer account changed'),
),
ParametrizedCustomerWebhookEvent(
'pretix.customer.anonymized',
_('Customer account anonymized'),
),
)


@@ -62,27 +62,27 @@ class NamespacedCache:
prefix = int(time.time())
self.cache.set(self.prefixkey, prefix)

def set(self, key: str, value: any, timeout: int=300):
def set(self, key: str, value: str, timeout: int=300):
return self.cache.set(self._prefix_key(key), value, timeout)

def get(self, key: str) -> any:
def get(self, key: str) -> str:
return self.cache.get(self._prefix_key(key, known_prefix=self._last_prefix))

def get_or_set(self, key: str, default: Callable, timeout=300) -> any:
def get_or_set(self, key: str, default: Callable, timeout=300) -> str:
return self.cache.get_or_set(
self._prefix_key(key, known_prefix=self._last_prefix),
default=default,
timeout=timeout
)

def get_many(self, keys: List[str]) -> Dict[str, any]:
def get_many(self, keys: List[str]) -> Dict[str, str]:
values = self.cache.get_many([self._prefix_key(key) for key in keys])
newvalues = {}
for k, v in values.items():
newvalues[self._strip_prefix(k)] = v
return newvalues

def set_many(self, values: Dict[str, any], timeout=300):
def set_many(self, values: Dict[str, str], timeout=300):
newvalues = {}
for k, v in values.items():
newvalues[self._prefix_key(k)] = v
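The `NamespacedCache` hunk above prefixes every key with a shared namespace value, so the whole namespace can be invalidated by rotating the prefix instead of deleting keys. A minimal in-memory sketch of that idea (class and method names here are illustrative, not pretix's actual implementation, which delegates to a Django cache backend):

```python
import itertools

class PrefixNamespacedCache:
    """Invalidate everything at once by changing the key prefix."""

    # Monotonic counter guarantees each rotation picks a fresh prefix.
    _counter = itertools.count()

    def __init__(self):
        self._store = {}
        self._prefix = next(self._counter)

    def _prefix_key(self, key: str) -> str:
        return f"{self._prefix}:{key}"

    def set(self, key, value):
        self._store[self._prefix_key(key)] = value

    def get(self, key, default=None):
        return self._store.get(self._prefix_key(key), default)

    def clear(self):
        # Old entries are not deleted; they simply become unreachable
        # because every future lookup uses the new prefix. A backing
        # store with TTLs (e.g. redis/memcached) garbage-collects them.
        self._prefix = next(self._counter)
```

This trades memory (stale entries linger until they expire) for an O(1) namespace-wide `clear()`.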
@@ -134,11 +134,8 @@ class TemplateBasedMailRenderer(BaseHTMLMailRenderer):
def template_name(self):
raise NotImplementedError()

def compile_markdown(self, plaintext):
return markdown_compile_email(plaintext)

def render(self, plain_body: str, plain_signature: str, subject: str, order, position) -> str:
body_md = self.compile_markdown(plain_body)
body_md = markdown_compile_email(plain_body)
htmlctx = {
'site': settings.PRETIX_INSTANCE_NAME,
'site_url': settings.SITE_URL,
@@ -156,7 +153,7 @@ class TemplateBasedMailRenderer(BaseHTMLMailRenderer):

if plain_signature:
signature_md = plain_signature.replace('\n', '<br>\n')
signature_md = self.compile_markdown(signature_md)
signature_md = markdown_compile_email(signature_md)
htmlctx['signature'] = signature_md

if order:
@@ -669,11 +666,6 @@ def base_placeholders(sender, **kwargs):
lambda waiting_list_entry: concatenation_for_salutation(waiting_list_entry.name_parts),
_("Mr Doe"),
))
ph.append(SimpleFunctionalMailTextPlaceholder(
"name", ["waiting_list_entry"],
lambda waiting_list_entry: waiting_list_entry.name or "",
_("Mr Doe"),
))
ph.append(SimpleFunctionalMailTextPlaceholder(
"name_for_salutation", ["position_or_address"],
lambda position_or_address: concatenation_for_salutation(get_best_name(position_or_address, parts=True)),

@@ -140,7 +140,7 @@ class BaseExporter:
"""
return {}

def render(self, form_data: dict) -> Tuple[str, str, Optional[bytes]]:
def render(self, form_data: dict) -> Tuple[str, str, bytes]:
"""
Render the exported file and return a tuple consisting of a filename, a file type
and file content.

@@ -549,9 +549,7 @@ class OrderListExporter(MultiSheetListExporter):
headers.append(_('End date'))
headers += [
_('Product'),
_('Product ID'),
_('Variation'),
_('Variation ID'),
_('Price'),
_('Tax rate'),
_('Tax rule'),
@@ -658,9 +656,7 @@ class OrderListExporter(MultiSheetListExporter):
row.append('')
row += [
str(op.item),
str(op.item_id),
str(op.variation) if op.variation else '',
str(op.variation_id) if op.variation_id else '',
op.price,
op.tax_rate,
str(op.tax_rule) if op.tax_rule else '',

@@ -60,18 +60,6 @@ def replace_arabic_numbers(inp):
return inp.translate(table)


def format_placeholders_help_text(placeholders, event=None):
placeholders = [(k, v.render_sample(event) if event else v) for k, v in placeholders.items()]
placeholders.sort(key=lambda x: x[0])
phs = [
'<button type="button" class="content-placeholder" title="%s">{%s}</button>' % (_("Sample: %s") % v if v else "", k)
for k, v in placeholders
]
return _('Available placeholders: {list}').format(
list=' '.join(phs)
)


class DatePickerWidget(forms.DateInput):
def __init__(self, attrs=None, date_format=None):
attrs = attrs or {}

@@ -49,9 +49,6 @@ class BaseMediaType:
def handle_unknown(self, organizer, identifier, user, auth):
pass

def handle_new(self, organizer, medium, user, auth):
pass

def __str__(self):
return str(self.verbose_name)

@@ -111,43 +108,9 @@ class NfcUidMediaType(BaseMediaType):
return m


class NfcMf0aesMediaType(BaseMediaType):
identifier = 'nfc_mf0aes'
verbose_name = 'NFC Mifare Ultralight AES'
medium_created_by_server = False
supports_giftcard = True
supports_orderposition = False

def handle_new(self, organizer, medium, user, auth):
from pretix.base.models import GiftCard

if organizer.settings.get(f'reusable_media_type_{self.identifier}_autocreate_giftcard', as_type=bool):
with transaction.atomic():
gc = GiftCard.objects.create(
issuer=organizer,
expires=organizer.default_gift_card_expiry,
currency=organizer.settings.get(f'reusable_media_type_{self.identifier}_autocreate_giftcard_currency'),
)
medium.linked_giftcard = gc
medium.save()
medium.log_action(
'pretix.reusable_medium.linked_giftcard.changed',
user=user, auth=auth,
data={
'linked_giftcard': gc.pk
}
)
gc.log_action(
'pretix.giftcards.created',
user=user, auth=auth,
)
return medium


MEDIA_TYPES = {
m.identifier: m for m in [
BarcodePlainMediaType(),
NfcUidMediaType(),
NfcMf0aesMediaType(),
]
}

@@ -264,7 +264,7 @@ def metric_values():

# Metrics from redis
if settings.HAS_REDIS:
for key, value in redis.hscan_iter(REDIS_KEY, count=1000):
for key, value in redis.hscan_iter(REDIS_KEY):
dkey = key.decode("utf-8")
splitted = dkey.split("{", 2)
value = float(value.decode("utf-8"))
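The `dkey.split("{", 2)` step above separates a Prometheus-style metric key into its name and its label part. A small illustrative helper (the function name is mine, not pretix's):

```python
def split_metric_key(dkey: str):
    # 'requests_total{code="200"}' -> ('requests_total', '{code="200"}')
    # A key without labels comes back with an empty label part.
    name, sep, rest = dkey.partition("{")
    return name, (sep + rest) if sep else ""
```

`str.partition` splits on the first `{` only, so quoted braces inside label values to the right are left untouched.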
@@ -26,7 +26,7 @@ from zoneinfo import ZoneInfo, ZoneInfoNotFoundError
from django.conf import settings
from django.http import Http404, HttpRequest, HttpResponse
from django.middleware.common import CommonMiddleware
from django.urls import get_script_prefix, resolve
from django.urls import get_script_prefix
from django.utils import timezone, translation
from django.utils.cache import patch_vary_headers
from django.utils.deprecation import MiddlewareMixin
@@ -230,8 +230,6 @@ class SecurityMiddleware(MiddlewareMixin):
)

def process_response(self, request, resp):
url = resolve(request.path_info)

if settings.DEBUG and resp.status_code >= 400:
# Don't use CSP on debug error pages as it breaks Django's fancy error
# pages
@@ -251,28 +249,20 @@

h = {
'default-src': ["{static}"],
'script-src': ['{static}'],
'script-src': ['{static}', 'https://checkout.stripe.com', 'https://js.stripe.com'],
'object-src': ["'none'"],
'frame-src': ['{static}'],
'frame-src': ['{static}', 'https://checkout.stripe.com', 'https://js.stripe.com'],
'style-src': ["{static}", "{media}"],
'connect-src': ["{dynamic}", "{media}"],
'img-src': ["{static}", "{media}", "data:"] + img_src,
'connect-src': ["{dynamic}", "{media}", "https://checkout.stripe.com"],
'img-src': ["{static}", "{media}", "data:", "https://*.stripe.com"] + img_src,
'font-src': ["{static}"],
'media-src': ["{static}", "data:"],
# form-action is not only used to match on form actions, but also on URLs
# form-actions redirect to. In the context of e.g. payment providers or
# single-sign-on this can be nearly anything, so we cannot really restrict
# this. However, we'll restrict it to HTTPS.
'form-action': ["{dynamic}", "https:"] + (['http:'] if settings.SITE_URL.startswith('http://') else []),
}
# Only include pay.google.com for wallet detection purposes on the Payment selection page
if (
url.url_name == "event.order.pay.change" or
(url.url_name == "event.checkout" and url.kwargs['step'] == "payment")
):
h['script-src'].append('https://pay.google.com')
h['frame-src'].append('https://pay.google.com')
h['connect-src'].append('https://google.com/pay')
if settings.LOG_CSP:
h['report-uri'] = ["/csp_report/"]
if 'Content-Security-Policy' in resp:
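The `h` dict above maps each CSP directive to a list of allowed sources. Serializing such a dict into the header value the browser expects is a one-liner; this is a sketch of the general mechanism, not necessarily the exact helper the middleware uses:

```python
def render_csp(directives: dict) -> str:
    # Content-Security-Policy syntax: each directive is
    # "name source source ...", and directives are joined with "; ".
    return "; ".join(
        f"{name} {' '.join(sources)}" for name, sources in directives.items()
    )
```

Because plain `dict`s preserve insertion order, the resulting header lists directives in the order they were added, which makes the output stable and easy to diff in tests.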
@@ -1,23 +0,0 @@
# Generated by Django 4.1.9 on 2023-06-26 10:59

from django.db import migrations, models


class Migration(migrations.Migration):

dependencies = [
('pretixbase', '0242_auto_20230512_1008'),
]

operations = [
migrations.AddField(
model_name='device',
name='os_name',
field=models.CharField(max_length=190, null=True),
),
migrations.AddField(
model_name='device',
name='os_version',
field=models.CharField(max_length=190, null=True),
),
]
@@ -1,35 +0,0 @@
# Generated by Django 3.2.18 on 2023-05-17 11:32

import django.db.models.deletion
from django.db import migrations, models


class Migration(migrations.Migration):

dependencies = [
('pretixbase', '0243_device_os_name_and_os_version'),
]

operations = [
migrations.AddField(
model_name='device',
name='rsa_pubkey',
field=models.TextField(null=True),
),
migrations.CreateModel(
name='MediumKeySet',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False)),
('public_id', models.BigIntegerField(unique=True)),
('media_type', models.CharField(max_length=100)),
('active', models.BooleanField(default=True)),
('uid_key', models.BinaryField()),
('diversification_key', models.BinaryField()),
('organizer', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='medium_key_sets', to='pretixbase.organizer')),
],
),
migrations.AddConstraint(
model_name='mediumkeyset',
constraint=models.UniqueConstraint(condition=models.Q(('active', True)), fields=('organizer', 'media_type'), name='keyset_unique_active'),
),
]
@@ -1,34 +0,0 @@
# Generated by Django 4.2.4 on 2023-08-28 12:30

from django.db import migrations, models


class Migration(migrations.Migration):
dependencies = [
("pretixbase", "0244_mediumkeyset"),
]

operations = [
migrations.AddField(
model_name="discount",
name="benefit_apply_to_addons",
field=models.BooleanField(default=True),
),
migrations.AddField(
model_name="discount",
name="benefit_ignore_voucher_discounted",
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name="discount",
name="benefit_limit_products",
field=models.ManyToManyField(
related_name="benefit_discounts", to="pretixbase.item"
),
),
migrations.AddField(
model_name="discount",
name="benefit_same_products",
field=models.BooleanField(default=True),
),
]
@@ -97,7 +97,7 @@ def _transactions_mark_order_dirty(order_id, using=None):
if getattr(dirty_transactions, 'order_ids', None) is None:
dirty_transactions.order_ids = set()

if _check_for_dirty_orders not in [func for (savepoint_id, func, *__) in conn.run_on_commit]:
if _check_for_dirty_orders not in [func for savepoint_id, func in conn.run_on_commit]:
transaction.on_commit(_check_for_dirty_orders, using)
dirty_transactions.order_ids.clear()  # This is necessary to clean up after old threads with rollbacked transactions


@@ -265,16 +265,16 @@ class CheckinList(LoggedModel):
# * in pretix.helpers.jsonlogic_boolalg
# * in checkinrules.js
# * in libpretixsync
# * in pretixscan-ios
# * in pretixscan-ios (in the future)
top_level_operators = {
'<', '<=', '>', '>=', '==', '!=', 'inList', 'isBefore', 'isAfter', 'or', 'and'
}
allowed_operators = top_level_operators | {
'buildTime', 'objectList', 'lookup', 'var', 'entries_since', 'entries_before'
'buildTime', 'objectList', 'lookup', 'var',
}
allowed_vars = {
'product', 'variation', 'now', 'now_isoweekday', 'entries_number', 'entries_today', 'entries_days',
'minutes_since_last_entry', 'minutes_since_first_entry', 'gate',
'minutes_since_last_entry', 'minutes_since_first_entry',
}
if not rules or not isinstance(rules, dict):
return rules
@@ -299,10 +299,6 @@ class CheckinList(LoggedModel):
raise ValidationError(f'Logic variable "{values[0]}" is currently not allowed.')
return rules

if operator in ('entries_since', 'entries_before'):
if len(values) != 1 or "buildTime" not in values[0]:
raise ValidationError(f'Operator "{operator}" takes exactly one "buildTime" argument.')

if operator in ('or', 'and') and seen_nonbool:
raise ValidationError('You cannot use OR/AND logic on a level below a comparison operator.')
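The check-in rule validation above walks a JSON-logic tree where every node is a single-operator dict. The operator allow-list plus the `entries_since`/`entries_before` argument check can be sketched as a recursive walk (a hypothetical standalone version; the real method also tracks variables, nesting and top-level operators):

```python
def validate_rules(rules, allowed_operators):
    # Each node is a dict with exactly one operator key; its argument
    # list is validated recursively. Non-dict leaves (numbers, strings)
    # are accepted as-is.
    if not isinstance(rules, dict):
        return rules
    (operator, values), = rules.items()
    if operator not in allowed_operators:
        raise ValueError(f'Logic operator "{operator}" is currently not allowed.')
    if operator in ('entries_since', 'entries_before'):
        # These operators must wrap exactly one {"buildTime": ...} argument.
        if len(values) != 1 or "buildTime" not in values[0]:
            raise ValueError(f'Operator "{operator}" takes exactly one "buildTime" argument.')
    for v in values:
        validate_rules(v, allowed_operators)
    return rules
```

Rejecting unknown operators (rather than ignoring them) matters here because, as the comment block notes, the same rules are evaluated by several independent implementations that must stay in sync.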
@@ -143,14 +143,6 @@ class Device(LoggedModel):
max_length=190,
null=True, blank=True
)
os_name = models.CharField(
max_length=190,
null=True, blank=True
)
os_version = models.CharField(
max_length=190,
null=True, blank=True
)
software_brand = models.CharField(
max_length=190,
null=True, blank=True
@@ -166,10 +158,6 @@ class Device(LoggedModel):
null=True,
blank=False
)
rsa_pubkey = models.TextField(
null=True,
blank=True,
)
info = models.JSONField(
null=True, blank=True,
)

@@ -99,7 +99,7 @@ class Discount(LoggedModel):
)
condition_apply_to_addons = models.BooleanField(
default=True,
verbose_name=_("Count add-on products"),
verbose_name=_("Apply to add-on products"),
help_text=_("Discounts never apply to bundled products"),
)
condition_ignore_voucher_discounted = models.BooleanField(
@@ -107,7 +107,7 @@ class Discount(LoggedModel):
verbose_name=_("Ignore products discounted by a voucher"),
help_text=_("If this option is checked, products that already received a discount through a voucher will not "
"be considered for this discount. However, products that use a voucher only to e.g. unlock a "
"hidden product or gain access to sold-out quota will still be considered."),
"hidden product or gain access to sold-out quota will still receive the discount."),
)
condition_min_count = models.PositiveIntegerField(
verbose_name=_('Minimum number of matching products'),
@@ -120,19 +120,6 @@ class Discount(LoggedModel):
default=Decimal('0.00'),
)

benefit_same_products = models.BooleanField(
default=True,
verbose_name=_("Apply discount to same set of products"),
help_text=_("By default, the discount is applied across the same selection of products as the condition for "
"the discount given above. If you want, you can however also select a different selection of "
"products.")
)
benefit_limit_products = models.ManyToManyField(
'Item',
verbose_name=_("Apply discount to specific products"),
related_name='benefit_discounts',
blank=True
)
benefit_discount_matching_percent = models.DecimalField(
verbose_name=_('Percentual discount on matching products'),
decimal_places=2,
@@ -152,18 +139,6 @@
blank=True,
validators=[MinValueValidator(1)],
)
benefit_apply_to_addons = models.BooleanField(
default=True,
verbose_name=_("Apply to add-on products"),
help_text=_("Discounts never apply to bundled products"),
)
benefit_ignore_voucher_discounted = models.BooleanField(
default=False,
verbose_name=_("Ignore products discounted by a voucher"),
help_text=_("If this option is checked, products that already received a discount through a voucher will not "
"be discounted. However, products that use a voucher only to e.g. unlock a hidden product or gain "
"access to sold-out quota will still receive the discount."),
)

# more feature ideas:
# - max_usages_per_order
@@ -212,14 +187,6 @@
'on a minimum value.')
)

if data.get('subevent_mode') == cls.SUBEVENT_MODE_DISTINCT and not data.get('benefit_same_products'):
raise ValidationError(
{'benefit_same_products': [
_('You cannot apply the discount to a different set of products if the discount is only valid '
'for bookings of different dates.')
]}
)

def allow_delete(self):
return not self.orderposition_set.exists()

@@ -230,7 +197,6 @@
'condition_min_value': self.condition_min_value,
'benefit_only_apply_to_cheapest_n_matches': self.benefit_only_apply_to_cheapest_n_matches,
'subevent_mode': self.subevent_mode,
'benefit_same_products': self.benefit_same_products,
})

def is_available_by_time(self, now_dt=None) -> bool:
@@ -241,14 +207,14 @@
return False
return True

def _apply_min_value(self, positions, condition_idx_group, benefit_idx_group, result):
if self.condition_min_value and sum(positions[idx][2] for idx in condition_idx_group) < self.condition_min_value:
def _apply_min_value(self, positions, idx_group, result):
if self.condition_min_value and sum(positions[idx][2] for idx in idx_group) < self.condition_min_value:
return

if self.condition_min_count or self.benefit_only_apply_to_cheapest_n_matches:
raise ValueError('Validation invariant violated.')

for idx in benefit_idx_group:
for idx in idx_group:
previous_price = positions[idx][2]
new_price = round_decimal(
previous_price * (Decimal('100.00') - self.benefit_discount_matching_percent) / Decimal('100.00'),
@@ -256,8 +222,8 @@
)
result[idx] = new_price

def _apply_min_count(self, positions, condition_idx_group, benefit_idx_group, result):
if len(condition_idx_group) < self.condition_min_count:
def _apply_min_count(self, positions, idx_group, result):
if len(idx_group) < self.condition_min_count:
return

if not self.condition_min_count or self.condition_min_value:
@@ -267,17 +233,15 @@
if not self.condition_min_count:
raise ValueError('Validation invariant violated.')

condition_idx_group = sorted(condition_idx_group, key=lambda idx: (positions[idx][2], -idx))  # sort by line_price
benefit_idx_group = sorted(benefit_idx_group, key=lambda idx: (positions[idx][2], -idx))  # sort by line_price
idx_group = sorted(idx_group, key=lambda idx: (positions[idx][2], -idx))  # sort by line_price

# Prevent over-consuming of items, i.e. if our discount is "buy 2, get 1 free", we only
# want to match multiples of 3
n_groups = min(len(condition_idx_group) // self.condition_min_count, len(benefit_idx_group))
consume_idx = condition_idx_group[:n_groups * self.condition_min_count]
benefit_idx = benefit_idx_group[:n_groups * self.benefit_only_apply_to_cheapest_n_matches]
consume_idx = idx_group[:len(idx_group) // self.condition_min_count * self.condition_min_count]
benefit_idx = idx_group[:len(idx_group) // self.condition_min_count * self.benefit_only_apply_to_cheapest_n_matches]
else:
consume_idx = condition_idx_group
benefit_idx = benefit_idx_group
consume_idx = idx_group
benefit_idx = idx_group

for idx in benefit_idx:
previous_price = positions[idx][2]
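The consume/benefit slicing in this hunk implements "buy N, get the cheapest M discounted" in whole groups only, so a "buy 2, get 1 free" discount matches multiples of 3 and never over-consumes lines. A standalone sketch over plain price lists (function name and signature are illustrative, not pretix's method):

```python
from decimal import Decimal

def discount_cheapest_matches(prices, min_count, cheapest_n, percent):
    # Sort candidate positions by price ascending (ties broken by index),
    # keep only whole groups of min_count, and discount the cheapest_n
    # lines per matched group. Returns {index: new_price}.
    order = sorted(range(len(prices)), key=lambda i: (prices[i], -i))
    n_groups = len(order) // min_count
    benefit = order[:n_groups * cheapest_n]
    factor = (Decimal('100.00') - percent) / Decimal('100.00')
    return {i: (prices[i] * factor).quantize(Decimal('0.01')) for i in benefit}
```

With three lines and `min_count=3`, `cheapest_n=1`, `percent=100`, exactly one group matches and only the cheapest line drops to zero; a fourth line would not unlock a second free item until three more condition lines are present.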
|
||||
@@ -312,7 +276,7 @@ class Discount(LoggedModel):
limit_products = {p.pk for p in self.condition_limit_products.all()}

# First, filter out everything not even covered by our product scope
condition_candidates = [
initial_candidates = [
idx
for idx, (item_id, subevent_id, line_price_gross, is_addon_to, voucher_discount) in positions.items()
if (
@@ -322,25 +286,11 @@ class Discount(LoggedModel):
)
]

if self.benefit_same_products:
benefit_candidates = list(condition_candidates)
else:
benefit_products = {p.pk for p in self.benefit_limit_products.all()}
benefit_candidates = [
idx
for idx, (item_id, subevent_id, line_price_gross, is_addon_to, voucher_discount) in positions.items()
if (
item_id in benefit_products and
(self.benefit_apply_to_addons or not is_addon_to) and
(not self.benefit_ignore_voucher_discounted or voucher_discount is None or voucher_discount == Decimal('0.00'))
)
]

if self.subevent_mode == self.SUBEVENT_MODE_MIXED: # also applies to non-series events
if self.condition_min_count:
self._apply_min_count(positions, condition_candidates, benefit_candidates, result)
self._apply_min_count(positions, initial_candidates, result)
else:
self._apply_min_value(positions, condition_candidates, benefit_candidates, result)
self._apply_min_value(positions, initial_candidates, result)

elif self.subevent_mode == self.SUBEVENT_MODE_SAME:
def key(idx):
@@ -349,18 +299,17 @@ class Discount(LoggedModel):
# Build groups of candidates with the same subevent, then apply our regular algorithm
# to each group

_groups = groupby(sorted(condition_candidates, key=key), key=key)
candidate_groups = [(k, list(g)) for k, g in _groups]
_groups = groupby(sorted(initial_candidates, key=key), key=key)
candidate_groups = [list(g) for k, g in _groups]

for subevent_id, g in candidate_groups:
benefit_g = [idx for idx in benefit_candidates if positions[idx][1] == subevent_id]
for g in candidate_groups:
if self.condition_min_count:
self._apply_min_count(positions, g, benefit_g, result)
self._apply_min_count(positions, g, result)
else:
self._apply_min_value(positions, g, benefit_g, result)
self._apply_min_value(positions, g, result)

elif self.subevent_mode == self.SUBEVENT_MODE_DISTINCT:
if self.condition_min_value or not self.benefit_same_products:
if self.condition_min_value:
raise ValueError('Validation invariant violated.')

# Build optimal groups of candidates with distinct subevents, then apply our regular algorithm
@@ -387,7 +336,7 @@ class Discount(LoggedModel):
candidates = []
cardinality = None
for se, l in subevent_to_idx.items():
l = [ll for ll in l if ll in condition_candidates and ll not in current_group]
l = [ll for ll in l if ll in initial_candidates and ll not in current_group]
if cardinality and len(l) != cardinality:
continue
if se not in {positions[idx][1] for idx in current_group}:
@@ -424,5 +373,5 @@ class Discount(LoggedModel):
break

for g in candidate_groups:
self._apply_min_count(positions, g, g, result)
self._apply_min_count(positions, g, result)
return result
@@ -743,7 +743,12 @@ class Event(EventMixin, LoggedModel):
return ObjectRelatedCache(self)

def lock(self):
raise NotImplementedError("this method has been removed")
"""
Returns a contextmanager that can be used to lock an event for bookings.
"""
from pretix.base.services import locking

return locking.LockManager(self)

def get_mail_backend(self, timeout=None):
"""
@@ -902,18 +907,14 @@ class Event(EventMixin, LoggedModel):
self.items.filter(hidden_if_available_id=oldid).update(hidden_if_available=q)

for d in Discount.objects.filter(event=other).prefetch_related('condition_limit_products'):
c_items = list(d.condition_limit_products.all())
b_items = list(d.benefit_limit_products.all())
items = list(d.condition_limit_products.all())
d.pk = None
d.event = self
d.save(force_insert=True)
d.log_action('pretix.object.cloned')
for i in c_items:
for i in items:
if i.pk in item_map:
d.condition_limit_products.add(item_map[i.pk])
for i in b_items:
if i.pk in item_map:
d.benefit_limit_products.add(item_map[i.pk])

question_map = {}
for q in Question.objects.filter(event=other).prefetch_related('items', 'options'):

@@ -43,7 +43,6 @@ from typing import Optional, Tuple
from zoneinfo import ZoneInfo

import dateutil.parser
import django_redis
from dateutil.tz import datetime_exists
from django.conf import settings
from django.core.exceptions import ValidationError
@@ -58,6 +57,7 @@ from django.utils.functional import cached_property
from django.utils.timezone import is_naive, make_aware, now
from django.utils.translation import gettext_lazy as _, pgettext_lazy
from django_countries.fields import Country
from django_redis import get_redis_connection
from django_scopes import ScopedManager
from i18nfield.fields import I18nCharField, I18nTextField

@@ -1463,7 +1463,7 @@ class Question(LoggedModel):
(TYPE_PHONENUMBER, _("Phone number")),
)
UNLOCALIZED_TYPES = [TYPE_DATE, TYPE_TIME, TYPE_DATETIME]
ASK_DURING_CHECKIN_UNSUPPORTED = []
ASK_DURING_CHECKIN_UNSUPPORTED = [TYPE_PHONENUMBER]

event = models.ForeignKey(
Event,
@@ -1910,13 +1910,8 @@ class Quota(LoggedModel):

def rebuild_cache(self, now_dt=None):
if settings.HAS_REDIS:
rc = django_redis.get_redis_connection("redis")
p = rc.pipeline()
p.hdel(f'quotas:{self.event_id}:availabilitycache', str(self.pk))
p.hdel(f'quotas:{self.event_id}:availabilitycache:nocw', str(self.pk))
p.hdel(f'quotas:{self.event_id}:availabilitycache:igcl', str(self.pk))
p.hdel(f'quotas:{self.event_id}:availabilitycache:nocw:igcl', str(self.pk))
p.execute()
rc = get_redis_connection("redis")
rc.hdel(f'quotas:{self.event_id}:availabilitycache', str(self.pk))
self.availability(now_dt=now_dt)

def availability(
@@ -88,7 +88,9 @@ class LogEntry(models.Model):

class Meta:
ordering = ('-datetime', '-id')
indexes = [models.Index(fields=["datetime", "id"])]
index_together = [
['datetime', 'id']
]

def display(self):
from ..signals import logentry_display

@@ -121,30 +121,5 @@ class ReusableMedium(LoggedModel):

class Meta:
unique_together = (("identifier", "type", "organizer"),)
indexes = [
models.Index(fields=("identifier", "type", "organizer")),
models.Index(fields=("updated", "id")),
]
index_together = (("identifier", "type", "organizer"), ("updated", "id"))
ordering = "identifier", "type", "organizer"


class MediumKeySet(models.Model):
organizer = models.ForeignKey('Organizer', on_delete=models.CASCADE, related_name='medium_key_sets')
public_id = models.BigIntegerField(
unique=True,
)
media_type = models.CharField(max_length=100)
active = models.BooleanField(default=True)
uid_key = models.BinaryField()
diversification_key = models.BinaryField()

objects = ScopedManager(organizer='organizer')

class Meta:
constraints = [
models.UniqueConstraint(
fields=["organizer", "media_type"],
condition=Q(active=True),
name="keyset_unique_active",
),
]
@@ -37,13 +37,10 @@ import copy
import hashlib
import json
import logging
import operator
import string
from collections import Counter
from datetime import datetime, time, timedelta
from decimal import Decimal
from functools import reduce
from time import sleep
from typing import Any, Dict, List, Union
from zoneinfo import ZoneInfo

@@ -78,6 +75,7 @@ from pretix.base.email import get_email_context
from pretix.base.i18n import language
from pretix.base.models import Customer, User
from pretix.base.reldate import RelativeDateWrapper
from pretix.base.services.locking import LOCK_TIMEOUT, NoLockManager
from pretix.base.settings import PERSON_NAME_SCHEMES
from pretix.base.signals import order_gracefully_delete

@@ -85,7 +83,6 @@ from ...helpers import OF_SELF
from ...helpers.countries import CachedCountries, FastCountryField
from ...helpers.format import format_map
from ...helpers.names import build_name
from ...testutils.middleware import debugflags_var
from ._transactions import (
_fail, _transactions_mark_order_clean, _transactions_mark_order_dirty,
)
@@ -273,9 +270,9 @@ class Order(LockModel, LoggedModel):
verbose_name = _("Order")
verbose_name_plural = _("Orders")
ordering = ("-datetime", "-pk")
indexes = [
models.Index(fields=["datetime", "id"]),
models.Index(fields=["last_modified", "id"]),
index_together = [
["datetime", "id"],
["last_modified", "id"],
]

def __str__(self):
@@ -828,7 +825,7 @@ class Order(LockModel, LoggedModel):
if cp.has_checkin:
return False

if self.event.settings.get('invoice_address_asked', as_type=bool) or self.event.settings.get('invoice_name_required', as_type=bool):
if self.event.settings.get('invoice_address_asked', as_type=bool):
return True
ask_names = self.event.settings.get('attendee_names_asked', as_type=bool)
for cp in positions:
@@ -899,34 +896,7 @@ class Order(LockModel, LoggedModel):
), tz)
return term_last

@property
def payment_term_expire_date(self):
delay = self.event.settings.get('payment_term_expire_delay_days', as_type=int)
if not delay: # performance saver + backwards compatibility
return self.expires

term_last = self.payment_term_last
if term_last and self.expires > term_last: # backwards compatibility
return self.expires

expires = self.expires.date() + timedelta(days=delay)
if self.event.settings.get('payment_term_weekdays'):
if expires.weekday() == 5:
expires += timedelta(days=2)
elif expires.weekday() == 6:
expires += timedelta(days=1)

tz = ZoneInfo(self.event.settings.timezone)
expires = make_aware(datetime.combine(
expires,
time(hour=23, minute=59, second=59)
), tz)
if term_last:
return min(expires, term_last)
else:
return expires
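The weekend adjustment inside `payment_term_expire_date` (removed in this diff) can be sketched on its own. `skip_weekend` is a hypothetical helper, not a pretix API: deadlines landing on Saturday (weekday 5) or Sunday (weekday 6) are pushed to Monday.

```python
from datetime import date, timedelta

def skip_weekend(expires: date) -> date:
    # Shift a payment deadline off the weekend: Saturday/Sunday -> Monday.
    if expires.weekday() == 5:       # Saturday
        expires += timedelta(days=2)
    elif expires.weekday() == 6:     # Sunday
        expires += timedelta(days=1)
    return expires
```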
def _can_be_paid(self, count_waitinglist=True, ignore_date=False, force=False, lock=False) -> Union[bool, str]:
def _can_be_paid(self, count_waitinglist=True, ignore_date=False, force=False) -> Union[bool, str]:
error_messages = {
'late_lastdate': _("The payment can not be accepted as the last date of payments configured in the "
"payment settings is over."),
@@ -947,11 +917,10 @@ class Order(LockModel, LoggedModel):
if not self.event.settings.get('payment_term_accept_late') and not ignore_date and not force:
return error_messages['late']

return self._is_still_available(count_waitinglist=count_waitinglist, force=force, lock=lock)
return self._is_still_available(count_waitinglist=count_waitinglist, force=force)

def _is_still_available(self, now_dt: datetime=None, count_waitinglist=True, lock=False, force=False,
def _is_still_available(self, now_dt: datetime=None, count_waitinglist=True, force=False,
check_voucher_usage=False, check_memberships=False) -> Union[bool, str]:
from pretix.base.services.locking import lock_objects
from pretix.base.services.memberships import (
validate_memberships_in_order,
)
@@ -970,21 +939,10 @@ class Order(LockModel, LoggedModel):
try:
if check_memberships:
try:
validate_memberships_in_order(self.customer, positions, self.event, lock=lock, testmode=self.testmode)
validate_memberships_in_order(self.customer, positions, self.event, lock=False, testmode=self.testmode)
except ValidationError as e:
raise Quota.QuotaExceededException(e.message)

for cp in positions:
cp._cached_quotas = list(cp.quotas) if not force else []

if lock:
lock_objects(
[q for q in reduce(operator.or_, (set(cp._cached_quotas) for cp in positions), set()) if q.size is not None] +
[op.voucher for op in positions if op.voucher and not force] +
[op.seat for op in positions if op.seat],
shared_lock_objects=[self.event]
)

for i, op in enumerate(positions):
if op.seat:
if not op.seat.is_available(ignore_orderpos=op):
@@ -1009,7 +967,7 @@ class Order(LockModel, LoggedModel):
voucher=op.voucher.code
))

quotas = op._cached_quotas
quotas = list(op.quotas)
if len(quotas) == 0:
raise Quota.QuotaExceededException(error_messages['unavailable'].format(
item=str(op.item) + (' - ' + str(op.variation) if op.variation else '')
@@ -1031,9 +989,6 @@ class Order(LockModel, LoggedModel):
))
except Quota.QuotaExceededException as e:
return str(e)

if 'sleep-after-quota-check' in debugflags_var.get():
sleep(2)
return True

def send_mail(self, subject: Union[str, LazyI18nString], template: Union[str, LazyI18nString],
@@ -1665,10 +1620,9 @@ class OrderPayment(models.Model):
return self.order.event.get_payment_providers(cached=True).get(self.provider)

@transaction.atomic()
def _mark_paid_inner(self, force, count_waitinglist, user, auth, ignore_date=False, overpaid=False, lock=False):
def _mark_paid_inner(self, force, count_waitinglist, user, auth, ignore_date=False, overpaid=False):
from pretix.base.signals import order_paid
can_be_paid = self.order._can_be_paid(count_waitinglist=count_waitinglist, ignore_date=ignore_date, force=force,
lock=lock)
can_be_paid = self.order._can_be_paid(count_waitinglist=count_waitinglist, ignore_date=ignore_date, force=force)
if can_be_paid is not True:
self.order.log_action('pretix.event.order.quotaexceeded', {
'message': can_be_paid
@@ -1691,13 +1645,12 @@ class OrderPayment(models.Model):
if status_change:
self.order.create_transactions()

def fail(self, info=None, user=None, auth=None, log_data=None, send_mail=True):
def fail(self, info=None, user=None, auth=None, log_data=None):
"""
Marks the order as failed and sets info to ``info``, but only if the order is in ``created`` or ``pending``
state. This is equivalent to setting ``state`` to ``OrderPayment.PAYMENT_STATE_FAILED`` and logging a failure,
but it adds strong database locking since we do not want to report a failure for an order that has just
but it adds strong database logging since we do not want to report a failure for an order that has just
been marked as paid.
:param send_mail: Whether an email should be sent to the user about this event (default: ``True``).
"""
with transaction.atomic():
locked_instance = OrderPayment.objects.select_for_update(of=OF_SELF).get(pk=self.pk)
@@ -1722,17 +1675,6 @@ class OrderPayment(models.Model):
'info': info,
'data': log_data,
}, user=user, auth=auth)

if send_mail:
with language(self.order.locale, self.order.event.settings.region):
email_subject = self.order.event.settings.mail_subject_order_payment_failed
email_template = self.order.event.settings.mail_text_order_payment_failed
email_context = get_email_context(event=self.order.event, order=self.order)
self.order.send_mail(
email_subject, email_template, email_context,
'pretix.event.order.email.payment_failed', user=user, auth=auth,
)

return True

def confirm(self, count_waitinglist=True, send_mail=True, force=False, user=None, auth=None, mail_text='',
@@ -1799,24 +1741,25 @@ class OrderPayment(models.Model):
))
return

with transaction.atomic():
self._mark_order_paid(count_waitinglist, send_mail, force, user, auth, mail_text, ignore_date, lock, payment_sum - refund_sum,
generate_invoice)
self._mark_order_paid(count_waitinglist, send_mail, force, user, auth, mail_text, ignore_date, lock, payment_sum - refund_sum,
generate_invoice)

def _mark_order_paid(self, count_waitinglist=True, send_mail=True, force=False, user=None, auth=None, mail_text='',
ignore_date=False, lock=True, payment_refund_sum=0, allow_generate_invoice=True):
from pretix.base.services.invoices import (
generate_invoice, invoice_qualified,
)
from pretix.base.services.locking import LOCK_TRUST_WINDOW

if lock and self.order.status == Order.STATUS_PENDING and self.order.expires > now() + timedelta(seconds=LOCK_TRUST_WINDOW):
if (self.order.status == Order.STATUS_PENDING and self.order.expires > now() + timedelta(seconds=LOCK_TIMEOUT * 2)) or not lock:
# Performance optimization. In this case, there's really no reason to lock everything and an atomic
# database transaction is more than enough.
lock = False
lockfn = NoLockManager
else:
lockfn = self.order.event.lock
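The performance shortcut above (skip locking while a pending order's expiry lies comfortably beyond the trust window, so no concurrent expiry can invalidate the quota check) amounts to roughly the following. `needs_lock` is a hypothetical helper illustrating the condition, not pretix code:

```python
from datetime import datetime, timedelta

def needs_lock(status_pending: bool, expires: datetime,
               now: datetime, trust_window_s: int) -> bool:
    # Lock only when the order is not pending, or its expiry falls
    # inside the trust window and could race with the payment.
    return not (status_pending and
                expires > now + timedelta(seconds=trust_window_s))
```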
self._mark_paid_inner(force, count_waitinglist, user, auth, overpaid=payment_refund_sum > self.order.total,
ignore_date=ignore_date, lock=lock)
with lockfn():
self._mark_paid_inner(force, count_waitinglist, user, auth, overpaid=payment_refund_sum > self.order.total,
ignore_date=ignore_date)

invoice = None
if invoice_qualified(self.order) and allow_generate_invoice:
@@ -2629,7 +2572,7 @@ class OrderPosition(AbstractPosition):
with language(self.order.locale, self.order.event.settings.region):
email_template = self.event.settings.mail_text_resend_link
email_context = get_email_context(event=self.order.event, order=self.order, position=self)
email_subject = self.event.settings.mail_subject_resend_link_attendee
email_subject = self.event.settings.mail_subject_resend_link
self.send_mail(
email_subject, email_template, email_context,
'pretix.event.order.email.resend', user=user, auth=auth,
@@ -2774,8 +2717,8 @@ class Transaction(models.Model):

class Meta:
ordering = 'datetime', 'pk'
indexes = [
models.Index(fields=['datetime', 'id'])
index_together = [
['datetime', 'id']
]

def save(self, *args, **kwargs):
@@ -340,17 +340,10 @@ class TaxRule(LoggedModel):
rules = self._custom_rules
if invoice_address:
for r in rules:
if r['country'] == 'ZZ': # Rule: Any country
pass
elif r['country'] == 'EU': # Rule: Any EU country
if not is_eu_country(invoice_address.country):
continue
elif '-' in r['country']: # Rule: Specific country and state
if r['country'] != str(invoice_address.country) + '-' + str(invoice_address.state):
continue
else: # Rule: Specific country
if r['country'] != str(invoice_address.country):
continue
if r['country'] == 'EU' and not is_eu_country(invoice_address.country):
continue
if r['country'] not in ('ZZ', 'EU') and r['country'] != str(invoice_address.country):
continue
if r['address_type'] == 'individual' and invoice_address.is_business:
continue
if r['address_type'] in ('business', 'business_vat_id') and not invoice_address.is_business:
continue
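The explicit country branching being removed in the hunk above can be sketched as follows. `rule_matches` and the abbreviated `EU` set are hypothetical, for illustration only: `'ZZ'` matches any country, `'EU'` any EU member, `'XX-YY'` a specific country plus state, anything else a specific country code.

```python
EU = {'DE', 'FR', 'IT'}  # abbreviated EU member set for the example

def rule_matches(rule_country, country, state=None):
    if rule_country == 'ZZ':           # any country
        return True
    if rule_country == 'EU':           # any EU country
        return country in EU
    if '-' in rule_country:            # specific country and state
        return rule_country == f'{country}-{state}'
    return rule_country == country     # specific country
```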
@@ -435,37 +435,28 @@ class Voucher(LoggedModel):

@staticmethod
def clean_quota_check(data, cnt, old_instance, event, quota, item, variation):
from ..services.locking import lock_objects
from ..services.quotas import QuotaAvailability

old_quotas = Voucher.clean_quota_get_ignored(old_instance)

if event.has_subevents and data.get('block_quota') and not data.get('subevent'):
raise ValidationError(_('If you want this voucher to block quota, you need to select a specific date.'))

if quota:
new_quotas = {quota}
if quota in old_quotas:
return
else:
avail = quota.availability(count_waitinglist=False)
elif item and item.has_variations and not variation:
raise ValidationError(_('You can only block quota if you specify a specific product variation. '
'Otherwise it might be unclear which quotas to block.'))
elif item and variation:
new_quotas = set(variation.quotas.filter(subevent=data.get('subevent')))
avail = variation.check_quotas(ignored_quotas=old_quotas, subevent=data.get('subevent'))
elif item and not item.has_variations:
new_quotas = set(item.quotas.filter(subevent=data.get('subevent')))
avail = item.check_quotas(ignored_quotas=old_quotas, subevent=data.get('subevent'))
else:
raise ValidationError(_('You need to select a specific product or quota if this voucher should reserve '
'tickets.'))

if not (new_quotas - old_quotas):
return

lock_objects([q for q in (new_quotas - old_quotas) if q.size is not None], shared_lock_objects=[event])

qa = QuotaAvailability(count_waitinglist=False)
qa.queue(*(new_quotas - old_quotas))
qa.compute()

if any(r[0] != Quota.AVAILABILITY_OK or (r[1] is not None and r[1] < cnt) for r in qa.results.values()):
if avail[0] != Quota.AVAILABILITY_OK or (avail[1] is not None and avail[1] < cnt):
raise ValidationError(_('You cannot create a voucher that blocks quota as the selected product or '
'quota is currently sold out or completely reserved.'))


@@ -805,7 +805,7 @@ class QuestionColumn(ImportColumn):
return self.q.clean_answer(value)

def assign(self, value, order, position, invoice_address, **kwargs):
if value is not None:
if value:
if not hasattr(order, '_answers'):
order._answers = []
if isinstance(value, QuestionOption):
@@ -830,12 +830,13 @@ class QuestionColumn(ImportColumn):
class CustomerColumn(ImportColumn):
identifier = 'customer'
verbose_name = gettext_lazy('Customer')
default_value = None

def clean(self, value, previous_values):
if value:
try:
value = self.event.organizer.customers.get(
Q(identifier=value) | Q(email__iexact=value) | Q(external_identifier=value)
Q(identifier=value) | Q(email=value) | Q(external_identifier=value)
)
except Customer.MultipleObjectsReturned:
value = self.event.organizer.customers.get(
@@ -78,16 +78,6 @@ from pretix.presale.views.cart import cart_session, get_or_create_cart_id
logger = logging.getLogger(__name__)


class WalletQueries:
APPLEPAY = 'applepay'
GOOGLEPAY = 'googlepay'

WALLETS = (
(APPLEPAY, pgettext_lazy('payment', 'Apple Pay')),
(GOOGLEPAY, pgettext_lazy('payment', 'Google Pay')),
)


class PaymentProviderForm(Form):
def clean(self):
cleaned_data = super().clean()
@@ -336,12 +326,6 @@ class BasePaymentProvider:
help_text=_('Users will not be able to choose this payment provider after the given date.'),
required=False,
)),
('_availability_start',
RelativeDateField(
label=_('Available from'),
help_text=_('Users will not be able to choose this payment provider before the given date.'),
required=False,
)),
('_total_min',
forms.DecimalField(
label=_('Minimum order total'),
@@ -447,31 +431,11 @@ class BasePaymentProvider:
'Share this link with customers who should use this payment method.'
),
)),
('_prevent_reminder_mail',
forms.BooleanField(
label=_('Do not send a payment reminder mail'),
help_text=_('Users will not receive a reminder mail to pay for their order before it expires if '
'they have chosen this payment method.'),
required=False,
)),
])
d['_restricted_countries']._as_type = list
d['_restrict_to_sales_channels']._as_type = list
return d

@property
def walletqueries(self):
"""
.. warning:: This property is considered **experimental**. It might change or get removed at any time without
prior notice.

A list of wallet payment methods that should be dynamically joined to the public name of the payment method,
if they are available to the user.
The detection is made on a best effort basis with no guarantees of correctness and actual availability.
Wallets that pretix can check for are exposed through ``pretix.base.payment.WalletQueries``.
"""
return []

def settings_form_clean(self, cleaned_data):
"""
Overriding this method allows you to inject custom validation into the settings form.
@@ -510,14 +474,6 @@ class BasePaymentProvider:
if order.status == Order.STATUS_PAID:
return _('paid')

def prevent_reminder_mail(self, order: Order, payment: OrderPayment) -> bool:
"""
This is called when a periodic task runs and sends out reminder mails to orders that have not been paid yet
and are soon expiring.
The default implementation returns the content of the _prevent_reminder_mail configuration variable (a boolean value).
"""
return self.settings.get('_prevent_reminder_mail', as_type=bool, default=False)

@property
def payment_form_fields(self) -> dict:
"""
@@ -560,65 +516,40 @@ class BasePaymentProvider:

return form

def _absolute_availability_date(self, rel_date, cart_id=None, order=None, aggregate_fn=min):
if not rel_date:
return None
if self.event.has_subevents and cart_id:
dates = [
rel_date.datetime(se).date()
for se in self.event.subevents.filter(
id__in=CartPosition.objects.filter(
cart_id=cart_id, event=self.event
).values_list('subevent', flat=True)
)
]
return aggregate_fn(dates) if dates else None
elif self.event.has_subevents and order:
dates = [
rel_date.datetime(se).date()
for se in self.event.subevents.filter(
id__in=order.positions.values_list('subevent', flat=True)
)
]
return aggregate_fn(dates) if dates else None
elif self.event.has_subevents:
raise NotImplementedError('Payment provider is not subevent-ready.')
else:
return rel_date.datetime(self.event).date()

def _is_available_by_time(self, now_dt=None, cart_id=None, order=None):
def _is_still_available(self, now_dt=None, cart_id=None, order=None):
now_dt = now_dt or now()
tz = ZoneInfo(self.event.settings.timezone)

try:
availability_start = self._absolute_availability_date(
self.settings.get('_availability_start', as_type=RelativeDateWrapper),
cart_id,
order,
# In an event series, we use min() for the start as well. This might be inconsistent with using min() for
# for the end, but makes it harder to put one self into a situation where no payment provider is available.
aggregate_fn=min
)
availability_date = self.settings.get('_availability_date', as_type=RelativeDateWrapper)
if availability_date:
if self.event.has_subevents and cart_id:
dates = [
availability_date.datetime(se).date()
for se in self.event.subevents.filter(
id__in=CartPosition.objects.filter(
cart_id=cart_id, event=self.event
).values_list('subevent', flat=True)
)
]
availability_date = min(dates) if dates else None
elif self.event.has_subevents and order:
dates = [
availability_date.datetime(se).date()
for se in self.event.subevents.filter(
id__in=order.positions.values_list('subevent', flat=True)
)
]
availability_date = min(dates) if dates else None
elif self.event.has_subevents:
logger.error('Payment provider is not subevent-ready.')
return False
else:
availability_date = availability_date.datetime(self.event).date()

if availability_start:
if availability_start > now_dt.astimezone(tz).date():
return False
if availability_date:
return availability_date >= now_dt.astimezone(tz).date()

availability_end = self._absolute_availability_date(
self.settings.get('_availability_date', as_type=RelativeDateWrapper),
cart_id,
order,
aggregate_fn=min
)

if availability_end:
if availability_end < now_dt.astimezone(tz).date():
return False

return True
except NotImplementedError:
logger.exception('Unable to check availability')
return False
return True
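The date-window check above reduces to a simple interval test. `available_in_window` is a hypothetical helper (not the pretix API) assuming both bounds are optional dates already resolved from the provider settings:

```python
from datetime import date

def available_in_window(today, start=None, end=None):
    # A provider is selectable only while today lies inside [start, end];
    # either bound may be None, meaning "no limit" on that side.
    if start and start > today:
        return False
    if end and end < today:
        return False
    return True
```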
def is_allowed(self, request: HttpRequest, total: Decimal=None) -> bool:
|
||||
"""
|
||||
@@ -627,9 +558,9 @@ class BasePaymentProvider:
|
||||
user will not be able to select this payment method. This will only be called
|
||||
during checkout, not on retrying.
|
||||
|
||||
The default implementation checks for the ``_availability_date`` setting to be either unset or in the future
|
||||
and for the ``_availability_from``, ``_total_max``, and ``_total_min`` requirements to be met. It also checks
|
||||
the ``_restrict_countries`` and ``_restrict_to_sales_channels`` setting.
|
||||
The default implementation checks for the _availability_date setting to be either unset or in the future
|
||||
and for the _total_max and _total_min requirements to be met. It also checks the ``_restrict_countries``
|
||||
and ``_restrict_to_sales_channels`` setting.
|
||||
|
||||
:param total: The total value without the payment method fee, after taxes.
|
||||
|
||||
@@ -638,7 +569,7 @@ class BasePaymentProvider:
|
||||
The ``total`` parameter has been added. For backwards compatibility, this method is called again
|
||||
without this parameter if it raises a ``TypeError`` on first try.
|
||||
"""
|
||||
timing = self._is_available_by_time(cart_id=get_or_create_cart_id(request))
|
||||
timing = self._is_still_available(cart_id=get_or_create_cart_id(request))
|
||||
pricing = True
|
||||
|
||||
if (self.settings._total_max is not None or self.settings._total_min is not None) and total is None:
|
||||
@@ -788,7 +719,7 @@ class BasePaymentProvider:
|
||||
the amount of money that should be paid.
|
||||
|
||||
If you need any special behavior, you can return a string containing the URL the user will be redirected to.
|
||||
If you are done with your process you should return the user to the order's detail page. Redirection is only
|
||||
If you are done with your process you should return the user to the order's detail page. Redirection is not
|
||||
allowed if you set ``execute_payment_needs_user`` to ``True``.
|
||||
|
||||
If the payment is completed, you should call ``payment.confirm()``. Please note that this might
|
||||
@@ -822,8 +753,8 @@ class BasePaymentProvider:
|
||||
Will be called to check whether it is allowed to change the payment method of
|
||||
an order to this one.
|
||||
|
||||
The default implementation checks for the ``_availability_date`` setting to be either unset or in the future,
|
||||
as well as for the ``_availability_from``, ``_total_max``, ``_total_min``, and ``_restricted_countries`` settings.
|
||||
The default implementation checks for the _availability_date setting to be either unset or in the future,
|
||||
as well as for the _total_max, _total_min and _restricted_countries settings.
|
||||
|
||||
:param order: The order object
|
||||
"""
|
||||
@@ -850,7 +781,7 @@ class BasePaymentProvider:
|
||||
if order.sales_channel not in self.settings.get('_restrict_to_sales_channels', as_type=list, default=['web']):
|
||||
return False
|
||||
|
||||
return self._is_available_by_time(order=order)
|
||||
return self._is_still_available(order=order)
|
||||
|
||||
def payment_prepare(self, request: HttpRequest, payment: OrderPayment) -> Union[bool, str]:
|
||||
"""
|
||||
|
||||
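The `is_allowed` docstrings above describe the provider's default availability checks. A minimal, self-contained sketch of the `_total_min`/`_total_max` part of that check (the helper name and signature are my own, not pretix API):

```python
from decimal import Decimal


def total_within_limits(total, total_min=None, total_max=None):
    # Hypothetical helper mirroring the ``_total_min``/``_total_max`` checks
    # described in the docstrings above; not part of pretix itself.
    if total is None:
        # Without a known total we cannot decide; allow the method, matching
        # the backwards-compatible call path described in the docstring.
        return True
    if total_min is not None and total < total_min:
        return False
    if total_max is not None and total > total_max:
        return False
    return True
```

A provider subclass would typically feed the cart total into such a check before offering the payment method.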
@@ -43,18 +43,16 @@ import subprocess
 import tempfile
 import unicodedata
 import uuid
-from collections import OrderedDict, defaultdict
+from collections import OrderedDict
 from functools import partial
 from io import BytesIO

 import jsonschema
 import reportlab.rl_config
 from bidi.algorithm import get_display
 from django.conf import settings
 from django.contrib.staticfiles import finders
 from django.core.exceptions import ValidationError
-from django.db.models import Max, Min
-from django.db.models.fields.files import FieldFile
 from django.dispatch import receiver
 from django.utils.deconstruct import deconstructible
 from django.utils.formats import date_format
@@ -62,8 +60,7 @@ from django.utils.html import conditional_escape
 from django.utils.timezone import now
 from django.utils.translation import gettext_lazy as _, pgettext
 from i18nfield.strings import LazyI18nString
-from pypdf import PdfReader, PdfWriter, Transformation
-from pypdf.generic import RectangleObject
+from pypdf import PdfReader
 from reportlab.graphics import renderPDF
 from reportlab.graphics.barcode.qr import QrCodeWidget
 from reportlab.graphics.shapes import Drawing
@@ -88,9 +85,6 @@ from pretix.presale.style import get_fonts

 logger = logging.getLogger(__name__)

-if not settings.DEBUG:
-    reportlab.rl_config.shapeChecking = 0
-

 DEFAULT_VARIABLES = OrderedDict((
     ("secret", {
@@ -108,10 +102,7 @@ DEFAULT_VARIABLES = OrderedDict((
     ("positionid", {
         "label": _("Order position number"),
         "editor_sample": "1",
-        "evaluate": lambda orderposition, order, event: str(orderposition.positionid),
-        # There is no performance gain in using evaluate_bulk here, but we want to make sure it is used somewhere
-        # in core to make sure we notice if the implementation of the API breaks.
-        "evaluate_bulk": lambda orderpositions: [str(p.positionid) for p in orderpositions],
+        "evaluate": lambda orderposition, order, event: str(orderposition.positionid)
     }),
     ("order_positionid", {
         "label": _("Order code and position number"),
@@ -364,9 +355,14 @@ DEFAULT_VARIABLES = OrderedDict((
     }),
     ("addons", {
         "label": _("List of Add-Ons"),
-        "editor_sample": _("Add-on 1\n2x Add-on 2"),
+        "editor_sample": _("Add-on 1\nAdd-on 2"),
         "evaluate": lambda op, order, ev: "\n".join([
-            str(p) for p in generate_compressed_addon_list(op, order, ev)
+            '{} - {}'.format(p.item.name, p.variation.value) if p.variation else str(p.item.name)
+            for p in (
+                op.addons.all() if 'addons' in getattr(op, '_prefetched_objects_cache', {})
+                else op.addons.select_related('item', 'variation')
+            )
+            if not p.canceled
         ])
     }),
     ("organizer", {
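The hunk above swaps the compressed add-on list (editor sample `"Add-on 1\n2x Add-on 2"`) for a plain per-position listing. The compression idea can be sketched independently of pretix models (function name and the plain-string input format are illustrative only):

```python
from collections import Counter


def compress_addon_names(names):
    # Illustrative stand-in (not the pretix function itself) for the
    # "compressed" add-on list shown above: repeated add-ons collapse
    # into a single "<count>x <name>" entry, first-seen order preserved.
    counts = Counter(names)  # Counter keeps insertion order on Python 3.7+
    return ['{}x {}'.format(c, n) if c > 1 else n for n, c in counts.items()]
```

The real implementation additionally keys on `(item, variation)` pairs and skips canceled positions.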
@@ -700,30 +696,6 @@ def get_seat(op: OrderPosition):
     return None


-def generate_compressed_addon_list(op, order, event):
-    itemcount = defaultdict(int)
-    addons = [p for p in (
-        op.addons.all() if 'addons' in getattr(op, '_prefetched_objects_cache', {})
-        else op.addons.select_related('item', 'variation')
-    ) if not p.canceled]
-    for pos in addons:
-        itemcount[pos.item, pos.variation] += 1
-
-    addonlist = []
-    for (item, variation), count in itemcount.items():
-        if variation:
-            if count > 1:
-                addonlist.append('{}x {} - {}'.format(count, item.name, variation.value))
-            else:
-                addonlist.append('{} - {}'.format(item.name, variation.value))
-        else:
-            if count > 1:
-                addonlist.append('{}x {}'.format(count, item.name))
-            else:
-                addonlist.append(item.name)
-    return addonlist
-
-
 class Renderer:

     def __init__(self, event, layout, background_file):
@@ -901,11 +873,6 @@ class Renderer:
                     anchor='c',  # centered in frame
                     mask='auto'
                 )
-                if isinstance(image_file, FieldFile):
-                    # ThumbnailingImageReader "closes" the file, so it's no use to use the same file pointer
-                    # in case we need it again. For FieldFile, fortunately, there is an easy way to make the file
-                    # refresh itself when it is used next.
-                    del image_file.file
             except:
                 logger.exception("Can not load or resize image")
             canvas.saveState()
@@ -972,7 +939,7 @@ class Renderer:

         # reportlab does not support unicode combination characters
         # It's important we do this before we use ArabicReshaper
-        text = unicodedata.normalize("NFC", text)
+        text = unicodedata.normalize("NFKC", text)

         # reportlab does not support RTL, ligature-heavy scripts like Arabic. Therefore, we use ArabicReshaper
         # to resolve all ligatures and python-bidi to switch RTL texts.
@@ -1025,10 +992,7 @@ class Renderer:
             elif o['type'] == "poweredby":
                 self._draw_poweredby(canvas, op, o)
         if self.bg_pdf:
-            page_size = (
-                self.bg_pdf.pages[0].mediabox[2] - self.bg_pdf.pages[0].mediabox[0],
-                self.bg_pdf.pages[0].mediabox[3] - self.bg_pdf.pages[0].mediabox[1]
-            )
+            page_size = (self.bg_pdf.pages[0].mediabox[2], self.bg_pdf.pages[0].mediabox[3])
             if self.bg_pdf.pages[0].get('/Rotate') in (90, 270):
                 # swap dimensions due to pdf being rotated
                 page_size = page_size[::-1]
@@ -1056,12 +1020,14 @@ class Renderer:
                 with open(os.path.join(d, 'out.pdf'), 'rb') as f:
                     return BytesIO(f.read())
         else:
+            from pypdf import PdfReader, PdfWriter, Transformation
+            from pypdf.generic import RectangleObject
             buffer.seek(0)
             new_pdf = PdfReader(buffer)
             output = PdfWriter()

             for i, page in enumerate(new_pdf.pages):
-                bg_page = copy.deepcopy(self.bg_pdf.pages[i])
+                bg_page = copy.copy(self.bg_pdf.pages[i])
                 bg_rotation = bg_page.get('/Rotate')
                 if bg_rotation:
                     # /Rotate is clockwise, transformation.rotate is counter-clockwise
@@ -1098,56 +1064,6 @@ class Renderer:
         return outbuffer


-def merge_background(fg_pdf, bg_pdf, out_file, compress):
-    if settings.PDFTK:
-        with tempfile.TemporaryDirectory() as d:
-            fg_filename = os.path.join(d, 'fg.pdf')
-            bg_filename = os.path.join(d, 'bg.pdf')
-            fg_pdf.write(fg_filename)
-            bg_pdf.write(bg_filename)
-            pdftk_cmd = [
-                settings.PDFTK,
-                fg_filename,
-                'multibackground',
-                bg_filename,
-                'output',
-                '-',
-            ]
-            if compress:
-                pdftk_cmd.append('compress')
-            subprocess.run(pdftk_cmd, check=True, stdout=out_file)
-    else:
-        output = PdfWriter()
-        for i, page in enumerate(fg_pdf.pages):
-            bg_page = copy.deepcopy(bg_pdf.pages[i])
-            bg_rotation = bg_page.get('/Rotate')
-            if bg_rotation:
-                # /Rotate is clockwise, transformation.rotate is counter-clockwise
-                t = Transformation().rotate(bg_rotation)
-                w = float(page.mediabox.getWidth())
-                h = float(page.mediabox.getHeight())
-                if bg_rotation in (90, 270):
-                    # offset due to rotation base
-                    if bg_rotation == 90:
-                        t = t.translate(h, 0)
-                    else:
-                        t = t.translate(0, w)
-                    # rotate mediabox as well
-                    page.mediabox = RectangleObject((
-                        page.mediabox.left.as_numeric(),
-                        page.mediabox.bottom.as_numeric(),
-                        page.mediabox.top.as_numeric(),
-                        page.mediabox.right.as_numeric(),
-                    ))
-                    page.trimbox = page.mediabox
-                elif bg_rotation == 180:
-                    t = t.translate(w, h)
-                page.add_transformation(t)
-            bg_page.merge_page(page)
-            output.add_page(bg_page)
-        output.write(out_file)
-
-
 @deconstructible
 class PdfLayoutValidator:
     def __call__(self, value):
@@ -36,7 +36,6 @@ import uuid
 from collections import Counter, defaultdict, namedtuple
 from datetime import datetime, time, timedelta
 from decimal import Decimal
-from time import sleep
 from typing import List, Optional

 from celery.exceptions import MaxRetriesExceededError
@@ -63,7 +62,7 @@ from pretix.base.models.orders import OrderFee
 from pretix.base.models.tax import TaxRule
 from pretix.base.reldate import RelativeDateWrapper
 from pretix.base.services.checkin import _save_answers
-from pretix.base.services.locking import LockTimeoutException, lock_objects
+from pretix.base.services.locking import LockTimeoutException, NoLockManager
 from pretix.base.services.pricing import (
     apply_discounts, get_line_price, get_listed_price, get_price,
     is_included_for_free,
@@ -77,7 +76,6 @@ from pretix.celery_app import app
 from pretix.presale.signals import (
     checkout_confirm_messages, fee_calculation_for_cart,
 )
-from pretix.testutils.middleware import debugflags_var


 class CartError(Exception):
@@ -143,10 +141,9 @@ error_messages = {
     'price_not_a_number': gettext_lazy('The entered price is not a number.'),
     'price_too_high': gettext_lazy('The entered price is to high.'),
     'voucher_invalid': gettext_lazy('This voucher code is not known in our database.'),
-    'voucher_min_usages': ngettext_lazy(
-        'The voucher code "%(voucher)s" can only be used if you select at least %(number)s matching products.',
-        'The voucher code "%(voucher)s" can only be used if you select at least %(number)s matching products.',
-        'number'
+    'voucher_min_usages': gettext_lazy(
+        'The voucher code "%(voucher)s" can only be used if you select at least %(number)s '
+        'matching products.'
     ),
     'voucher_min_usages_removed': ngettext_lazy(
         'The voucher code "%(voucher)s" can only be used if you select at least %(number)s matching products. '
@@ -1075,43 +1072,23 @@ class CartManager:
             )
         return err

-    @transaction.atomic(durable=True)
     def _perform_operations(self):
-        full_lock_required = any(getattr(o, 'seat', False) for o in self._operations) and self.event.settings.seating_minimal_distance > 0
-        if full_lock_required:
-            # We lock the entire event in this case since we don't want to deal with fine-granular locking
-            # in the case of seating distance enforcement
-            lock_objects([self.event])
-        else:
-            lock_objects(
-                [q for q, d in self._quota_diff.items() if q.size is not None and d > 0] +
-                [v for v, d in self._voucher_use_diff.items() if d > 0] +
-                [getattr(o, 'seat', False) for o in self._operations if getattr(o, 'seat', False)],
-                shared_lock_objects=[self.event]
-            )
         vouchers_ok = self._get_voucher_availability()
         quotas_ok = _get_quota_availability(self._quota_diff, self.now_dt)
         err = None
         new_cart_positions = []
-        deleted_positions = set()

         err = err or self._check_min_max_per_product()

         self._operations.sort(key=lambda a: self.order[type(a)])
         seats_seen = set()

-        if 'sleep-after-quota-check' in debugflags_var.get():
-            sleep(2)
-
         for iop, op in enumerate(self._operations):
             if isinstance(op, self.RemoveOperation):
                 if op.position.expires > self.now_dt:
                     for q in op.position.quotas:
                         quotas_ok[q] += 1
-                addons = op.position.addons.all()
-                deleted_positions |= {a.pk for a in addons}
-                addons.delete()
-                deleted_positions.add(op.position.pk)
+                op.position.addons.all().delete()
                 op.position.delete()

             elif isinstance(op, (self.AddOperation, self.ExtendOperation)):
@@ -1261,28 +1238,20 @@ class CartManager:
                 if op.seat and not op.seat.is_available(ignore_cart=op.position, sales_channel=self._sales_channel,
                                                        ignore_voucher_id=op.position.voucher_id):
                     err = err or error_messages['seat_unavailable']

-                    addons = op.position.addons.all()
-                    deleted_positions |= {a.pk for a in addons}
-                    deleted_positions.add(op.position.pk)
-                    addons.delete()
+                    op.position.addons.all().delete()
                     op.position.delete()
                 elif available_count == 1:
                     op.position.expires = self._expiry
                     op.position.listed_price = op.listed_price
                     op.position.price_after_voucher = op.price_after_voucher
                     # op.position.price will be updated by recompute_final_prices_and_taxes()
-                    if op.position.pk not in deleted_positions:
-                        try:
-                            op.position.save(force_update=True, update_fields=['expires', 'listed_price', 'price_after_voucher'])
-                        except DatabaseError:
-                            # Best effort... The position might have been deleted in the meantime!
-                            pass
+                    try:
+                        op.position.save(force_update=True, update_fields=['expires', 'listed_price', 'price_after_voucher'])
+                    except DatabaseError:
+                        # Best effort... The position might have been deleted in the meantime!
+                        pass
                 elif available_count == 0:
-                    addons = op.position.addons.all()
-                    deleted_positions |= {a.pk for a in addons}
-                    deleted_positions.add(op.position.pk)
-                    addons.delete()
+                    op.position.addons.all().delete()
                     op.position.delete()
                 else:
                     raise AssertionError("ExtendOperation cannot affect more than one item")
@@ -1307,11 +1276,22 @@ class CartManager:
                 p.save()
                 _save_answers(p, {}, p._answers)
         CartPosition.objects.bulk_create([p for p in new_cart_positions if not getattr(p, '_answers', None) and not p.pk])
-
-        if 'sleep-before-commit' in debugflags_var.get():
-            sleep(2)
         return err

+    def _require_locking(self):
+        if self._voucher_use_diff:
+            # If any vouchers are used, we lock to make sure we don't redeem them to often
+            return True
+
+        if self._quota_diff and any(q.size is not None for q in self._quota_diff):
+            # If any quotas are affected that are not unlimited, we lock
+            return True
+
+        if any(getattr(o, 'seat', False) for o in self._operations):
+            return True
+
+        return False
+
     def recompute_final_prices_and_taxes(self):
         positions = sorted(list(self.positions), key=lambda op: -(op.addon_to_id or 0))
         diff = Decimal('0.00')
@@ -1350,14 +1330,18 @@ class CartManager:
             err = self.extend_expired_positions() or err
             err = err or self._check_min_per_voucher()

-            self.now_dt = now()
-
-            self._extend_expiry_of_valid_existing_positions()
-            err = self._perform_operations() or err
-            self.recompute_final_prices_and_taxes()
-
-            if err:
-                raise CartError(err)
+            lockfn = NoLockManager
+            if self._require_locking():
+                lockfn = self.event.lock
+
+            with lockfn() as now_dt:
+                with transaction.atomic():
+                    self.now_dt = now_dt
+                    self._extend_expiry_of_valid_existing_positions()
+                    err = self._perform_operations() or err
+                    self.recompute_final_prices_and_taxes()
+                    if err:
+                        raise CartError(err)


 def add_payment_to_cart(request, provider, min_value: Decimal=None, max_value: Decimal=None, info_data: dict=None):
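The `_require_locking` logic restored by the hunks above can be summarized as a pure function (parameter names are illustrative, not the pretix API):

```python
def requires_locking(voucher_use_diff, quota_sizes, touches_seats):
    # Sketch of the lock decision in ``_require_locking`` above.
    # voucher_use_diff: mapping of voucher -> usage delta
    # quota_sizes: iterable of affected quota sizes (None = unlimited)
    # touches_seats: whether any operation involves a reserved seat
    if voucher_use_diff:
        return True  # vouchers could otherwise be redeemed too often
    if any(size is not None for size in quota_sizes):
        return True  # at least one limited quota is affected
    return bool(touches_seats)
```

When none of the conditions hold, the cart commit can skip the event lock entirely (the `NoLockManager` path).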
@@ -86,8 +86,8 @@ def _build_time(t=None, value=None, ev=None, now_dt=None):
         return ev.date_admission or ev.date_from


-def _logic_annotate_for_graphic_explain(rules, ev, rule_data, now_dt):
-    logic_environment = _get_logic_environment(ev, rule_data, now_dt)
+def _logic_annotate_for_graphic_explain(rules, ev, rule_data):
+    logic_environment = _get_logic_environment(ev)
     event = ev if isinstance(ev, Event) else ev.event

     def _evaluate_inners(r):
@@ -112,8 +112,6 @@ def _logic_annotate_for_graphic_explain(rules, ev, rule_data, now_dt):
                 val = str(event.items.get(pk=val))
             elif var == "variation":
                 val = str(ItemVariation.objects.get(item__event=event, pk=val))
-            elif var == "gate":
-                val = str(event.organizer.gates.get(pk=val))
             elif isinstance(val, datetime):
                 val = date_format(val.astimezone(ev.timezone), "SHORT_DATETIME_FORMAT")
             return {"var": var, "__result": val}
@@ -154,7 +152,7 @@ def _logic_explain(rules, ev, rule_data, now_dt=None):
    get in before 17:00". In the middle of the night it would switch to "You can only get in after 09:00".
    """
    now_dt = now_dt or now()
-    logic_environment = _get_logic_environment(ev, rule_data, now_dt)
+    logic_environment = _get_logic_environment(ev)
    _var_values = {'False': False, 'True': True}
    _var_explanations = {}

@@ -176,22 +174,15 @@ def _logic_explain(rules, ev, rule_data, now_dt=None):
            _var_values[new_var_name] = result
            if not result:
                # Operator returned false, let's dig deeper
-                if "var" in values[0]:
-                    if isinstance(values[0]["var"], list):
-                        values[0]["var"] = values[0]["var"][0]
-                    _var_explanations[new_var_name] = {
-                        'operator': operator,
-                        'var': values[0]["var"],
-                        'rhs': values[1:],
-                    }
-                elif "entries_since" in values[0] or "entries_before" in values[0]:
-                    _var_explanations[new_var_name] = {
-                        'operator': operator,
-                        'var': values[0],
-                        'rhs': values[1:],
-                    }
-                else:
+                if "var" not in values[0]:
                    raise ValueError("Binary operators should be normalized to have a variable on their left-hand side")
+                if isinstance(values[0]["var"], list):
+                    values[0]["var"] = values[0]["var"][0]
+                _var_explanations[new_var_name] = {
+                    'operator': operator,
+                    'var': values[0]["var"],
+                    'rhs': values[1:],
+                }
            return {'var': new_var_name}
        try:
            rules = _evaluate_inners(rules)
@@ -238,7 +229,7 @@ def _logic_explain(rules, ev, rule_data, now_dt=None):
    for vname, data in _var_explanations.items():
        var, operator, rhs = data['var'], data['operator'], data['rhs']
        if var == 'now':
-            compare_to = _build_time(*rhs[0]['buildTime'], ev=ev, now_dt=now_dt).astimezone(ev.timezone)
+            compare_to = _build_time(*rhs[0]['buildTime'], ev=ev).astimezone(ev.timezone)
            tolerance = timedelta(minutes=float(rhs[1])) if len(rhs) > 1 and rhs[1] else timedelta(seconds=0)
            if operator == 'isBefore':
                compare_to += tolerance
@@ -258,17 +249,11 @@ def _logic_explain(rules, ev, rule_data, now_dt=None):
        elif var == 'product' or var == 'variation':
            var_weights[vname] = (1000, 0)
            var_texts[vname] = _('Ticket type not allowed')
-        elif var == 'gate':
-            var_weights[vname] = (500, 0)
-            var_texts[vname] = _('Wrong entrance gate')
-        elif var in ('entries_number', 'entries_today', 'entries_days', 'minutes_since_last_entry', 'minutes_since_first_entry', 'now_isoweekday') \
-                or (isinstance(var, dict) and ("entries_since" in var or "entries_before" in var)):
+        elif var in ('entries_number', 'entries_today', 'entries_days', 'minutes_since_last_entry', 'minutes_since_first_entry', 'now_isoweekday'):
            w = {
                'minutes_since_first_entry': 80,
                'minutes_since_last_entry': 90,
                'entries_days': 100,
-                'entries_since': 110,
-                'entries_before': 110,
                'entries_number': 120,
                'entries_today': 140,
                'now_isoweekday': 210,
@@ -287,24 +272,8 @@ def _logic_explain(rules, ev, rule_data, now_dt=None):
                'entries_days': _('number of days with an entry'),
                'entries_number': _('number of entries'),
                'entries_today': _('number of entries today'),
-                'entries_since': _('number of entries since {datetime}'),
-                'entries_before': _('number of entries before {datetime}'),
                'now_isoweekday': _('week day'),
            }

-            if isinstance(var, dict) and ("entries_since" in var or "entries_before" in var):
-                varname = list(var.keys())[0]
-                cutoff = _build_time(*var[varname][0]['buildTime'], ev=ev, now_dt=now_dt).astimezone(ev.timezone)
-                if abs(now_dt - cutoff) < timedelta(hours=12):
-                    compare_to_text = date_format(cutoff, 'TIME_FORMAT')
-                else:
-                    compare_to_text = date_format(cutoff, 'SHORT_DATETIME_FORMAT')
-                l[varname] = str(l[varname].format(datetime=compare_to_text))
-                var = varname
-                var_result = getattr(rule_data, var)(cutoff)
-            else:
-                var_result = rule_data[var]
-
            compare_to = rhs[0]
            penalty = 0

@@ -321,7 +290,7 @@ def _logic_explain(rules, ev, rule_data, now_dt=None):
                # These are "technical" comparisons without real meaning, we don't want to show them.
                penalty = 1000

-            var_weights[vname] = (w[var] + operator_weights.get(operator, 0) + penalty, abs(compare_to - var_result))
+            var_weights[vname] = (w[var] + operator_weights.get(operator, 0) + penalty, abs(compare_to - rule_data[var]))

            if var == 'now_isoweekday':
                compare_to = {
@@ -368,14 +337,12 @@ def _logic_explain(rules, ev, rule_data, now_dt=None):
    return ', '.join(var_texts[v] for v in paths_with_min_weight[0] if not _var_values[v])


-def _get_logic_environment(ev, rule_data, now_dt):
+def _get_logic_environment(ev):
    # Every change to our supported JSON logic must be done
    # * in pretix.base.services.checkin
    # * in pretix.base.models.checkin
    # * in pretix.helpers.jsonlogic_boolalg
    # * in checkinrules.js
    # * in libpretixsync
    # * in pretixscan-ios

    def is_before(t1, t2, tolerance=None):
        if tolerance:
@@ -387,21 +354,17 @@ def _get_logic_environment(ev, rule_data, now_dt):
    logic.add_operation('objectList', lambda *objs: list(objs))
    logic.add_operation('lookup', lambda model, pk, str: int(pk))
    logic.add_operation('inList', lambda a, b: a in b)
-    logic.add_operation('buildTime', partial(_build_time, ev=ev, now_dt=now_dt))
+    logic.add_operation('buildTime', partial(_build_time, ev=ev))
    logic.add_operation('isBefore', is_before)
    logic.add_operation('isAfter', lambda t1, t2, tol=None: is_before(t2, t1, tol))
-    logic.add_operation('entries_since', lambda t1: rule_data.entries_since(t1))
-    logic.add_operation('entries_before', lambda t1: rule_data.entries_before(t1))
    return logic


class LazyRuleVars:
-    def __init__(self, position, clist, dt, gate):
+    def __init__(self, position, clist, dt):
        self._position = position
        self._clist = clist
        self._dt = dt
-        self._gate = gate
-        self.__cache = {}

    def __getitem__(self, item):
        if item[0] != '_' and hasattr(self, item):
@@ -417,10 +380,6 @@ class LazyRuleVars:
        tz = self._clist.event.timezone
        return self._dt.astimezone(tz).isoweekday()

-    @property
-    def gate(self):
-        return self._gate.pk if self._gate else None
-
    @property
    def product(self):
        return self._position.item_id
@@ -439,16 +398,6 @@ class LazyRuleVars:
        midnight = self._dt.astimezone(tz).replace(hour=0, minute=0, second=0, microsecond=0)
        return self._position.checkins.filter(type=Checkin.TYPE_ENTRY, list=self._clist, datetime__gte=midnight).count()

-    def entries_since(self, cutoff):
-        if ('entries_since', cutoff) not in self.__cache:
-            self.__cache['entries_since', cutoff] = self._position.checkins.filter(type=Checkin.TYPE_ENTRY, list=self._clist, datetime__gte=cutoff).count()
-        return self.__cache['entries_since', cutoff]
-
-    def entries_before(self, cutoff):
-        if ('entries_before', cutoff) not in self.__cache:
-            self.__cache['entries_before', cutoff] = self._position.checkins.filter(type=Checkin.TYPE_ENTRY, list=self._clist, datetime__lt=cutoff).count()
-        return self.__cache['entries_before', cutoff]
-
    @cached_property
    def entries_days(self):
        tz = self._clist.event.timezone
@@ -497,9 +446,8 @@ class SQLLogic:
    * Comparison operators (==, !=, …) never contain boolean operators (and, or) further down in the stack
    """

-    def __init__(self, list, gate=None):
+    def __init__(self, list):
        self.list = list
-        self.gate = gate
        self.bool_ops = {
            "and": lambda *args: reduce(lambda total, arg: total & arg, args) if args else Q(),
            "or": lambda *args: reduce(lambda total, arg: total | arg, args) if args else Q(),
@@ -515,7 +463,7 @@ class SQLLogic:
            "isBefore": partial(self.comparison_to_q, operator=LowerThan, modifier=partial(tolerance, sign=1)),
            "isAfter": partial(self.comparison_to_q, operator=GreaterThan, modifier=partial(tolerance, sign=-1)),
        }
-        self.expression_ops = {'buildTime', 'objectList', 'lookup', 'var', 'entries_since', 'entries_before'}
+        self.expression_ops = {'buildTime', 'objectList', 'lookup', 'var'}

    def operation_to_expression(self, rule):
        if not isinstance(rule, dict):
@@ -563,36 +511,6 @@ class SQLLogic:
            return [self.operation_to_expression(v) for v in values]
        elif operator == 'lookup':
            return int(values[1])
-        elif operator == 'entries_since':
-            return Coalesce(
-                Subquery(
-                    Checkin.objects.filter(
-                        position_id=OuterRef('pk'),
-                        type=Checkin.TYPE_ENTRY,
-                        list_id=self.list.pk,
-                        datetime__gte=self.operation_to_expression(values[0]),
-                    ).values('position_id').order_by().annotate(
-                        c=Count('*')
-                    ).values('c')
-                ),
-                Value(0),
-                output_field=IntegerField()
-            )
-        elif operator == 'entries_before':
-            return Coalesce(
-                Subquery(
-                    Checkin.objects.filter(
-                        position_id=OuterRef('pk'),
-                        type=Checkin.TYPE_ENTRY,
-                        list_id=self.list.pk,
-                        datetime__lt=self.operation_to_expression(values[0]),
-                    ).values('position_id').order_by().annotate(
-                        c=Count('*')
-                    ).values('c')
-                ),
-                Value(0),
-                output_field=IntegerField()
-            )
        elif operator == 'var':
            if values[0] == 'now':
                return Value(now().astimezone(timezone.utc))
@@ -602,8 +520,6 @@ class SQLLogic:
                return F('item_id')
            elif values[0] == 'variation':
                return F('variation_id')
-            elif values[0] == 'gate':
-                return Value(self.gate.pk if self.gate else None)
            elif values[0] == 'entries_number':
                return Coalesce(
                    Subquery(
@@ -815,8 +731,7 @@ def _save_answers(op, answers, given_answers):
def perform_checkin(op: OrderPosition, clist: CheckinList, given_answers: dict, force=False,
                    ignore_unpaid=False, nonce=None, datetime=None, questions_supported=True,
                    user=None, auth=None, canceled_supported=False, type=Checkin.TYPE_ENTRY,
-                    raw_barcode=None, raw_source_type=None, from_revoked_secret=False, simulate=False,
-                    gate=None):
+                    raw_barcode=None, raw_source_type=None, from_revoked_secret=False, simulate=False):
    """
    Create a checkin for this particular order position and check-in list. Fails with CheckInError if the check in is
    not valid at this time.
@@ -831,7 +746,6 @@ def perform_checkin(op: OrderPosition, clist: CheckinList, given_answers: dict,
    :param nonce: A random nonce to prevent race conditions.
    :param datetime: The datetime of the checkin, defaults to now.
    :param simulate: If true, the check-in is not saved.
-    :param gate: The gate the check-in was performed at.
    """

    # !!!!!!!!!
@@ -946,8 +860,8 @@ def perform_checkin(op: OrderPosition, clist: CheckinList, given_answers: dict,
        )

    if type == Checkin.TYPE_ENTRY and clist.rules:
-        rule_data = LazyRuleVars(op, clist, dt, gate=gate)
-        logic = _get_logic_environment(op.subevent or clist.event, rule_data, now_dt=dt)
+        rule_data = LazyRuleVars(op, clist, dt)
+        logic = _get_logic_environment(op.subevent or clist.event)
        if not logic.apply(clist.rules, rule_data):
            if force:
                force_used = True
@@ -971,10 +885,8 @@ def perform_checkin(op: OrderPosition, clist: CheckinList, given_answers: dict,
    device = None
    if isinstance(auth, Device):
        device = auth
-        if not gate:
-            gate = device.gate

-    last_cis = list(op.checkins.order_by('-datetime').filter(list=clist).only('type', 'nonce', 'position_id'))
+    last_cis = list(op.checkins.order_by('-datetime').filter(list=clist).only('type', 'nonce'))
    entry_allowed = (
        type == Checkin.TYPE_EXIT or
        clist.allow_multiple_entries or
@@ -996,7 +908,7 @@ def perform_checkin(op: OrderPosition, clist: CheckinList, given_answers: dict,
        list=clist,
        datetime=dt,
        device=device,
-        gate=gate,
+        gate=device.gate if device else None,
        nonce=nonce,
        forced=force and (not entry_allowed or from_revoked_secret or force_used),
        force_sent=force,
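The `isBefore`/`isAfter` operations kept in `_get_logic_environment` above can be exercised stand-alone; this free-standing form is a sketch that mirrors the tolerance handling shown in the hunks (a tolerance given in minutes relaxes the comparison):

```python
from datetime import datetime, timedelta


def is_before(t1, t2, tolerance=None):
    # Sketch of the 'isBefore' operation registered via logic.add_operation
    # above: true if t1 is before t2, optionally shifted by a tolerance
    # expressed in minutes.
    if tolerance:
        return t1 < t2 + timedelta(minutes=float(tolerance))
    return t1 < t2
```

`isAfter` is then simply `is_before` with the operands swapped, as the diff shows.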
@@ -26,7 +26,7 @@ from typing import Any, Dict, Union
 from celery.exceptions import MaxRetriesExceededError
 from django.conf import settings
 from django.core.files.base import ContentFile
-from django.db import close_old_connections, connection, transaction
+from django.db import connection, transaction
 from django.dispatch import receiver
 from django.utils.timezone import now, override
 from django.utils.translation import gettext
@@ -86,12 +86,9 @@ def export(self, event: Event, fileid: str, provider: str, form_data: Dict[str,
                 gettext('Your export did not contain any data.')
             )
         file.filename, file.type, data = d
-
-        close_old_connections()  # This task can run very long, we might need a new DB connection
-
         f = ContentFile(data)
         file.file.save(cachedfile_name(file, file.filename), f)
-        return str(file.pk)
+        return file.pk


 @app.task(base=ProfiledOrganizerUserTask, throws=(ExportError,), bind=True)
@@ -157,12 +154,9 @@ def multiexport(self, organizer: Organizer, user: User, device: int, token: int,
                 gettext('Your export did not contain any data.')
             )
         file.filename, file.type, data = d
-
-        close_old_connections()  # This task can run very long, we might need a new DB connection
-
         f = ContentFile(data)
         file.file.save(cachedfile_name(file, file.filename), f)
-        return str(file.pk)
+        return file.pk


 def _run_scheduled_export(schedule, context: Union[Event, Organizer], exporter, config_url, retry_func, has_permission):
@@ -220,11 +214,6 @@ def _run_scheduled_export(schedule, context: Union[Event, Organizer], exporter,
                 raise ExportError(
                     gettext('Your exported data exceeded the size limit for scheduled exports.')
                 )
-
-            conn = transaction.get_connection()
-            if not conn.in_atomic_block:  # atomic execution only happens during tests or with celery always_eager on
-                close_old_connections()  # This task can run very long, we might need a new DB connection
-
             f = ContentFile(data)
             file.file.save(cachedfile_name(file, file.filename), f)
         except ExportEmptyError as e:
|
||||
# <https://www.gnu.org/licenses/>.
|
||||
#
|
||||
|
||||
# This file is based on an earlier version of pretix which was released under the Apache License 2.0. The full text of
|
||||
# the Apache License 2.0 can be obtained at <http://www.apache.org/licenses/LICENSE-2.0>.
|
||||
#
|
||||
# This file may have since been changed and any changes are released under the terms of AGPLv3 as described above. A
|
||||
# full history of changes and contributors is available at <https://github.com/pretix/pretix>.
|
||||
#
|
||||
# This file contains Apache-licensed contributions copyrighted by: Tobias Kunze
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software distributed under the Apache License 2.0 is
|
||||
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations under the License.
|
||||
|
||||
import logging
|
||||
from itertools import groupby
|
||||
import time
|
||||
import uuid
|
||||
from datetime import timedelta
|
||||
|
||||
from django.conf import settings
|
||||
from django.db import DatabaseError, connection
|
||||
from django.db import transaction
|
||||
from django.utils.timezone import now
|
||||
|
||||
from pretix.base.models import Event, Membership, Quota, Seat, Voucher
|
||||
from pretix.testutils.middleware import debugflags_var
|
||||
from pretix.base.models import EventLock
|
||||
|
||||
logger = logging.getLogger('pretix.base.locking')
|
||||
|
||||
# A lock acquisition is aborted if it takes longer than LOCK_ACQUISITION_TIMEOUT to prevent connection starvation
|
||||
LOCK_ACQUISITION_TIMEOUT = 3
|
||||
|
||||
# We make the assumption that it is safe to e.g. transform an order into a cart if the order has a lifetime of more than
|
||||
# LOCK_TRUST_WINDOW into the future. In other words, we assume that a lock is never held longer than LOCK_TRUST_WINDOW.
|
||||
# This assumption holds true for all in-request locks, since our gunicorn default settings kill a worker that takes
|
||||
# longer than 60 seconds to process a request. It however does not hold true for celery tasks, especially long-running
|
||||
# ones, so this does introduce *some* risk of incorrect locking.
|
||||
LOCK_TRUST_WINDOW = 120
|
||||
|
||||
# These are different offsets for the different types of keys we want to lock
|
||||
KEY_SPACES = {
|
||||
Event: 1,
|
||||
Quota: 2,
|
||||
Seat: 3,
|
||||
Voucher: 4,
|
||||
Membership: 5
|
||||
}
|
||||
|
||||
|
||||
def pg_lock_key(obj):
|
||||
"""
|
||||
This maps the primary key space of multiple tables to a single bigint key space within postgres. It is not
|
||||
an injective function, which is fine, as long as collisions are rare.
|
||||
"""
|
||||
keyspace = KEY_SPACES.get(type(obj))
|
||||
objectid = obj.pk
|
||||
if not keyspace:
|
||||
raise ValueError(f"No key space defined for locking objects of type {type(obj)}")
|
||||
assert isinstance(objectid, int)
|
||||
# 64bit int: xxxxxxxx xxxxxxx xxxxxxx xxxxxxx xxxxxx xxxxxxx xxxxxxx xxxxxxx
|
||||
# | objectid mod 2**48 | |index| |keysp.|
|
||||
key = ((objectid % 281474976710656) << 16) | ((settings.DATABASE_ADVISORY_LOCK_INDEX % 256) << 8) | (keyspace % 256)
|
||||
return key
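
As a standalone illustration of the bit layout above, a minimal re-implementation of the packing (the installation index `1` and the key space `2` for `Quota` are assumptions for the demo, standing in for `settings.DATABASE_ADVISORY_LOCK_INDEX` and the `KEY_SPACES` mapping):

```python
# Sketch of the key packing performed by pg_lock_key; not pretix code itself.
def pack_lock_key(object_id: int, keyspace: int, index: int = 1) -> int:
    # bits 16..63: object primary key modulo 2**48
    # bits 8..15:  installation-wide advisory-lock index
    # bits 0..7:   key space (model type)
    return ((object_id % 2 ** 48) << 16) | ((index % 256) << 8) | (keyspace % 256)

key = pack_lock_key(12345, keyspace=2)   # 2 = Quota in the mapping above
assert key & 0xFF == 2                   # key space survives in the lowest byte
assert (key >> 8) & 0xFF == 1            # installation index in the next byte
assert key >> 16 == 12345                # object id fills the upper 48 bits
```

Because primary keys are reduced modulo 2**48, two very distant ids can collide; as the docstring notes, that only needs to be rare, not impossible.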


class LockTimeoutException(Exception):
    pass


def lock_objects(objects, *, shared_lock_objects=None, replace_exclusive_with_shared_when_exclusive_are_more_than=20):
    """
    Create an exclusive lock on the objects passed in `objects`. This function MUST be called within an atomic
    transaction and SHOULD be called only once per transaction to prevent deadlocks.

    A shared lock will be created on objects passed in `shared_lock_objects`.

    If `objects` contains more than `replace_exclusive_with_shared_when_exclusive_are_more_than` objects, `objects`
    will be ignored and `shared_lock_objects` will be used in its place and receive an exclusive lock.

    The idea behind this is: Usually we create a lock on every quota, voucher, or seat contained in an order.
    However, this has a large performance penalty in case hundreds of locks are required. Therefore, we always
    place a shared lock on the event, and if we have too many affected objects, we fall back to event-level locks.
    """
    if (not objects and not shared_lock_objects) or 'skip-locking' in debugflags_var.get():
        return

    if 'fail-locking' in debugflags_var.get():
        raise LockTimeoutException()

    if not connection.in_atomic_block:
        raise RuntimeError(
            "You cannot create locks outside of a transaction"
        )

    if 'postgresql' in settings.DATABASES['default']['ENGINE']:
        shared_keys = set(pg_lock_key(obj) for obj in shared_lock_objects) if shared_lock_objects else set()
        exclusive_keys = set(pg_lock_key(obj) for obj in objects)
        if replace_exclusive_with_shared_when_exclusive_are_more_than and len(exclusive_keys) > replace_exclusive_with_shared_when_exclusive_are_more_than:
            exclusive_keys = shared_keys
        keys = sorted(list(shared_keys | exclusive_keys))
        calls = ", ".join([
            (f"pg_advisory_xact_lock({k})" if k in exclusive_keys else f"pg_advisory_xact_lock_shared({k})") for k in keys
        ])

        try:
            with connection.cursor() as cursor:
                cursor.execute(f"SET LOCAL lock_timeout = '{LOCK_ACQUISITION_TIMEOUT}s';")
                cursor.execute(f"SELECT {calls};")
                cursor.execute("SET LOCAL lock_timeout = '0';")  # back to default
        except DatabaseError as e:
            logger.warning(f"Waiting for locks timed out: {e} on SELECT {calls};")
            raise LockTimeoutException()

    else:
        for model, instances in groupby(objects, key=lambda o: type(o)):
            model.objects.select_for_update().filter(pk__in=[o.pk for o in instances])
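
A minimal sketch (a hypothetical helper, not pretix code) of how the advisory-lock statement above is assembled. The important property is that all keys are locked in sorted order, so concurrent transactions acquire them in the same sequence and cannot deadlock on each other:

```python
def build_lock_query(exclusive_keys: set, shared_keys: set) -> str:
    # Sort the union so every transaction acquires locks in the same order;
    # a key present in both sets gets the stronger, exclusive lock.
    keys = sorted(shared_keys | exclusive_keys)
    calls = ", ".join(
        f"pg_advisory_xact_lock({k})" if k in exclusive_keys
        else f"pg_advisory_xact_lock_shared({k})"
        for k in keys
    )
    return f"SELECT {calls};"

q = build_lock_query({20, 10}, {30})
assert q == ("SELECT pg_advisory_xact_lock(10), pg_advisory_xact_lock(20), "
             "pg_advisory_xact_lock_shared(30);")
```

`pg_advisory_xact_lock` locks are released automatically at transaction end, which is why no explicit unlock appears anywhere in the code above.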
LOCK_TIMEOUT = 120


class NoLockManager:
@@ -131,3 +57,128 @@ class NoLockManager:
    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is not None:
            return False


class LockManager:
    def __init__(self, event):
        self.event = event

    def __enter__(self):
        lock_event(self.event)
        return now()

    def __exit__(self, exc_type, exc_val, exc_tb):
        release_event(self.event)
        if exc_type is not None:
            return False


class LockTimeoutException(Exception):
    pass


class LockReleaseException(LockTimeoutException):
    pass


def lock_event(event):
    """
    Issue a lock on this event so nobody can book tickets for this event until
    you release the lock. Will retry 5 times on failure.

    :raises LockTimeoutException: if the event is locked every time we try
                                  to obtain the lock
    """
    if hasattr(event, '_lock') and event._lock:
        return True

    if settings.HAS_REDIS:
        return lock_event_redis(event)
    else:
        return lock_event_db(event)


def release_event(event):
    """
    Release a lock placed by :py:meth:`lock()`. The lock will only be released
    if it was issued in _this_ python representation of the database object.

    :raises LockReleaseException: if we do not own the lock
    """
    if not hasattr(event, '_lock') or not event._lock:
        raise LockReleaseException('Lock is not owned by this thread')
    if settings.HAS_REDIS:
        return release_event_redis(event)
    else:
        return release_event_db(event)


def lock_event_db(event):
    retries = 5
    for i in range(retries):
        with transaction.atomic():
            dt = now()
            l, created = EventLock.objects.get_or_create(event=event.id)
            if created:
                event._lock = l
                return True
            elif l.date < now() - timedelta(seconds=LOCK_TIMEOUT):
                newtoken = str(uuid.uuid4())
                updated = EventLock.objects.filter(event=event.id, token=l.token).update(date=dt, token=newtoken)
                if updated:
                    l.token = newtoken
                    event._lock = l
                    return True
        time.sleep(2 ** i / 100)
    raise LockTimeoutException()
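
The retry loop above sleeps `2 ** i / 100` seconds between attempts, i.e. it backs off exponentially. For the default of five retries the schedule works out to:

```python
# Backoff schedule implied by time.sleep(2 ** i / 100) with retries = 5.
retries = 5
delays = [2 ** i / 100 for i in range(retries)]

assert delays == [0.01, 0.02, 0.04, 0.08, 0.16]
# worst case: roughly 0.31s spent sleeping before LockTimeoutException is raised
assert round(sum(delays), 2) == 0.31
```

This keeps the common case (lock free on the first or second try) fast while bounding how long a request can stall on a busy event.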


@transaction.atomic
def release_event_db(event):
    if not hasattr(event, '_lock') or not event._lock:
        raise LockReleaseException('Lock is not owned by this thread')
    try:
        lock = EventLock.objects.get(event=event.id, token=event._lock.token)
        lock.delete()
        event._lock = None
    except EventLock.DoesNotExist:
        raise LockReleaseException('Lock is no longer owned by this thread')


def redis_lock_from_event(event):
    from django_redis import get_redis_connection
    from redis.lock import Lock

    if not hasattr(event, '_lock') or not event._lock:
        rc = get_redis_connection("redis")
        event._lock = Lock(redis=rc, name='pretix_event_%s' % event.id, timeout=LOCK_TIMEOUT)
    return event._lock


def lock_event_redis(event):
    from redis.exceptions import RedisError

    lock = redis_lock_from_event(event)
    retries = 5
    for i in range(retries):
        try:
            if lock.acquire(blocking=False):
                return True
        except RedisError:
            logger.exception('Error locking an event')
            raise LockTimeoutException()
        time.sleep(2 ** i / 100)
    raise LockTimeoutException()


def release_event_redis(event):
    from redis import RedisError

    lock = redis_lock_from_event(event)
    try:
        lock.release()
    except RedisError:
        logger.exception('Error releasing an event lock')
        raise LockReleaseException()
    event._lock = None

@@ -1,72 +0,0 @@
#
# This file is part of pretix (Community Edition).
#
# Copyright (C) 2014-2020 Raphael Michel and contributors
# Copyright (C) 2020-2021 rami.io GmbH and contributors
#
# This program is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General
# Public License as published by the Free Software Foundation in version 3 of the License.
#
# ADDITIONAL TERMS APPLY: Pursuant to Section 7 of the GNU Affero General Public License, additional terms are
# applicable granting you additional permissions and placing additional restrictions on your usage of this software.
# Please refer to the pretix LICENSE file to obtain the full terms applicable to this work. If you did not receive
# this file, see <https://pretix.eu/about/en/license>.
#
# This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied
# warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more
# details.
#
# You should have received a copy of the GNU Affero General Public License along with this program. If not, see
# <https://www.gnu.org/licenses/>.
#
import secrets

from django.db import IntegrityError
from django.db.models import Q
from django_scopes import scopes_disabled

from pretix.base.models import GiftCardAcceptance
from pretix.base.models.media import MediumKeySet


def create_nfc_mf0aes_keyset(organizer):
    for i in range(20):
        public_id = secrets.randbelow(2 ** 32)
        uid_key = secrets.token_bytes(16)
        diversification_key = secrets.token_bytes(16)
        try:
            return MediumKeySet.objects.create(
                organizer=organizer,
                media_type="nfc_mf0aes",
                public_id=public_id,
                diversification_key=diversification_key,
                uid_key=uid_key,
                active=True,
            )
        except IntegrityError:  # either race condition with another thread or duplicate public ID
            try:
                return MediumKeySet.objects.get(
                    organizer=organizer,
                    media_type="nfc_mf0aes",
                    active=True,
                )
            except MediumKeySet.DoesNotExist:
                continue  # duplicate public ID, let's try again
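
The key material above comes from the standard-library `secrets` module: a random 32-bit public ID plus two 16-byte (128-bit) keys. A quick standalone sanity check of those primitives:

```python
import secrets

public_id = secrets.randbelow(2 ** 32)        # fits an unsigned 32-bit column
uid_key = secrets.token_bytes(16)             # 128-bit key material
diversification_key = secrets.token_bytes(16)

assert 0 <= public_id < 2 ** 32
assert len(uid_key) == len(diversification_key) == 16
```

With only 2**32 possible public IDs, collisions are unlikely but possible, which is why the function retries up to 20 times on `IntegrityError`.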


@scopes_disabled()
def get_keysets_for_organizer(organizer):
    sets = list(MediumKeySet.objects.filter(
        Q(organizer=organizer) | Q(organizer__in=GiftCardAcceptance.objects.filter(
            acceptor=organizer,
            active=True,
            reusable_media=True,
        ).values_list("issuer_id", flat=True))
    ))
    if organizer.settings.reusable_media_type_nfc_mf0aes and not any(
        ks.organizer == organizer and ks.media_type == "nfc_mf0aes" for ks in sets
    ):
        new_set = create_nfc_mf0aes_keyset(organizer)
        if new_set:
            sets.append(new_set)
    return sets
@@ -36,7 +36,7 @@ from pretix.base.models import (
from pretix.base.models.orders import Transaction
from pretix.base.orderimport import get_all_columns
from pretix.base.services.invoices import generate_invoice, invoice_qualified
from pretix.base.services.locking import lock_objects
from pretix.base.services.locking import NoLockManager
from pretix.base.services.tasks import ProfiledEventTask
from pretix.base.signals import order_paid, order_placed
from pretix.celery_app import app
@@ -88,6 +88,7 @@ def setif(record, obj, attr, setting):
def import_orders(event: Event, fileid: str, settings: dict, locale: str, user) -> None:
    cf = CachedFile.objects.get(id=fileid)
    user = User.objects.get(pk=user)
    seats_used = False
    with language(locale, event.settings.region):
        cols = get_all_columns(event)
        parsed = parse_csv(cf.file)
@@ -117,7 +118,6 @@ def import_orders(event: Event, fileid: str, settings: dict, locale: str, user)

        # Prepare model objects. Yes, this might consume lots of RAM, but allows us to make the actual SQL transaction
        # shorter. We'll see what works better in reality…
        lock_seats = []
        for i, record in enumerate(data):
            try:
                if order is None or settings['orders'] == 'many':
@@ -135,7 +135,7 @@ def import_orders(event: Event, fileid: str, settings: dict, locale: str, user)
                position.attendee_name_parts = {'_scheme': event.settings.name_scheme}
                position.meta_info = {}
                if position.seat is not None:
                    lock_seats.append(position.seat)
                    seats_used = True
                order._positions.append(position)
                position.assign_pseudonymization_id()

@@ -147,15 +147,12 @@ def import_orders(event: Event, fileid: str, settings: dict, locale: str, user)
                    _('Invalid data in row {row}: {message}').format(row=i, message=str(e))
                )

        try:
            with transaction.atomic():
                # We don't support vouchers, quotas, or memberships here, so we only need to lock if seats are in use
                if lock_seats:
                    lock_objects(lock_seats, shared_lock_objects=[event])
                    for s in lock_seats:
                        if not s.is_available():
                            raise ImportError(_('The seat you selected has already been taken. Please select a different seat.'))
        # We don't support vouchers, quotas, or memberships here, so we only need to lock if seats
        # are in use
        lockfn = event.lock if seats_used else NoLockManager

        try:
            with lockfn(), transaction.atomic():
                save_transactions = []
                for o in orders:
                    o.total = sum([c.price for c in o._positions])  # currently no support for fees

@@ -35,13 +35,10 @@

import json
import logging
import operator
import sys
from collections import Counter, defaultdict, namedtuple
from datetime import datetime, time, timedelta
from decimal import Decimal
from functools import reduce
from time import sleep
from typing import List, Optional

from celery.exceptions import MaxRetriesExceededError
@@ -85,9 +82,7 @@ from pretix.base.services import tickets
from pretix.base.services.invoices import (
    generate_cancellation, generate_invoice, invoice_qualified,
)
from pretix.base.services.locking import (
    LOCK_TRUST_WINDOW, LockTimeoutException, lock_objects,
)
from pretix.base.services.locking import LockTimeoutException, NoLockManager
from pretix.base.services.mail import SendMailException
from pretix.base.services.memberships import (
    create_membership, validate_memberships_in_order,
@@ -107,7 +102,6 @@ from pretix.celery_app import app
from pretix.helpers import OF_SELF
from pretix.helpers.models import modelcopy
from pretix.helpers.periodic import minimum_interval
from pretix.testutils.middleware import debugflags_var


class OrderError(Exception):
@@ -215,9 +209,9 @@ def reactivate_order(order: Order, force: bool=False, user: User=None, auth=None
    if order.status != Order.STATUS_CANCELED:
        raise OrderError(_('The order was not canceled.'))

    with transaction.atomic():
        is_available = order._is_still_available(now(), count_waitinglist=False, check_voucher_usage=True,
                                                 check_memberships=True, lock=True, force=force)
    with order.event.lock() as now_dt:
        is_available = order._is_still_available(now_dt, count_waitinglist=False, check_voucher_usage=True,
                                                 check_memberships=True, force=force)
        if is_available is True:
            if order.payment_refund_sum >= order.total:
                order.status = Order.STATUS_PAID
@@ -226,28 +220,29 @@ def reactivate_order(order: Order, force: bool=False, user: User=None, auth=None
            order.cancellation_date = None
            order.set_expires(now(),
                              order.event.subevents.filter(id__in=[p.subevent_id for p in order.positions.all()]))
            order.save(update_fields=['expires', 'status', 'cancellation_date'])
            order.log_action(
                'pretix.event.order.reactivated',
                user=user,
                auth=auth,
                data={
                    'expires': order.expires,
                }
            )
            for position in order.positions.all():
                if position.voucher:
                    Voucher.objects.filter(pk=position.voucher.pk).update(redeemed=Greatest(0, F('redeemed') + 1))
            with transaction.atomic():
                order.save(update_fields=['expires', 'status', 'cancellation_date'])
                order.log_action(
                    'pretix.event.order.reactivated',
                    user=user,
                    auth=auth,
                    data={
                        'expires': order.expires,
                    }
                )
                for position in order.positions.all():
                    if position.voucher:
                        Voucher.objects.filter(pk=position.voucher.pk).update(redeemed=Greatest(0, F('redeemed') + 1))

                for gc in position.issued_gift_cards.all():
                    gc = GiftCard.objects.select_for_update(of=OF_SELF).get(pk=gc.pk)
                    gc.transactions.create(value=position.price, order=order, acceptor=order.event.organizer)
                    break
                    for gc in position.issued_gift_cards.all():
                        gc = GiftCard.objects.select_for_update(of=OF_SELF).get(pk=gc.pk)
                        gc.transactions.create(value=position.price, order=order, acceptor=order.event.organizer)
                        break

                for m in position.granted_memberships.all():
                    m.canceled = False
                    m.save()
            order.create_transactions()
                    for m in position.granted_memberships.all():
                        m.canceled = False
                        m.save()
                order.create_transactions()
        else:
            raise OrderError(is_available)

@@ -269,6 +264,7 @@ def extend_order(order: Order, new_date: datetime, force: bool=False, valid_if_p
    if new_date < now():
        raise OrderError(_('The new expiry date needs to be in the future.'))

    @transaction.atomic
    def change(was_expired=True):
        old_date = order.expires
        order.expires = new_date
@@ -306,11 +302,11 @@ def extend_order(order: Order, new_date: datetime, force: bool=False, valid_if_p
            generate_invoice(order)
        order.create_transactions()

    with transaction.atomic():
        if order.status == Order.STATUS_PENDING:
            change(was_expired=False)
        else:
            is_available = order._is_still_available(now(), count_waitinglist=False, lock=True, force=force)
    if order.status == Order.STATUS_PENDING:
        change(was_expired=False)
    else:
        with order.event.lock() as now_dt:
            is_available = order._is_still_available(now_dt, count_waitinglist=False, force=force)
            if is_available is True:
                change(was_expired=True)
            else:
@@ -338,8 +334,9 @@ def mark_order_expired(order, user=None, auth=None):
        order = Order.objects.get(pk=order)
    if isinstance(user, int):
        user = User.objects.get(pk=user)
    order.status = Order.STATUS_EXPIRED
    order.save(update_fields=['status'])
    with order.event.lock():
        order.status = Order.STATUS_EXPIRED
        order.save(update_fields=['status'])

    order.log_action('pretix.event.order.expired', user=user, auth=auth)
    i = order.invoices.filter(is_cancellation=False).last()
@@ -442,8 +439,9 @@ def deny_order(order, comment='', user=None, send_mail: bool=True, auth=None):
    if not order.require_approval or not order.status == Order.STATUS_PENDING:
        raise OrderError(_('This order is not pending approval.'))

    order.status = Order.STATUS_CANCELED
    order.save(update_fields=['status'])
    with order.event.lock():
        order.status = Order.STATUS_CANCELED
        order.save(update_fields=['status'])

    order.log_action('pretix.event.order.denied', user=user, auth=auth, data={
        'comment': comment
@@ -523,49 +521,51 @@ def _cancel_order(order, user=None, send_mail: bool=True, api_token=None, device
            m.save()

    if cancellation_fee:
        for position in order.positions.all():
            if position.voucher:
                Voucher.objects.filter(pk=position.voucher.pk).update(redeemed=Greatest(0, F('redeemed') - 1))
            position.canceled = True
            assign_ticket_secret(
                event=order.event, position=position, force_invalidate_if_revokation_list_used=True, force_invalidate=False, save=False
            )
            position.save(update_fields=['canceled', 'secret'])
        new_fee = cancellation_fee
        for fee in order.fees.all():
            if keep_fees and fee in keep_fees:
                new_fee -= fee.value
        with order.event.lock():
            for position in order.positions.all():
                if position.voucher:
                    Voucher.objects.filter(pk=position.voucher.pk).update(redeemed=Greatest(0, F('redeemed') - 1))
                position.canceled = True
                assign_ticket_secret(
                    event=order.event, position=position, force_invalidate_if_revokation_list_used=True, force_invalidate=False, save=False
                )
                position.save(update_fields=['canceled', 'secret'])
            new_fee = cancellation_fee
            for fee in order.fees.all():
                if keep_fees and fee in keep_fees:
                    new_fee -= fee.value
                else:
                    fee.canceled = True
                    fee.save(update_fields=['canceled'])

        if new_fee:
            f = OrderFee(
                fee_type=OrderFee.FEE_TYPE_CANCELLATION,
                value=new_fee,
                tax_rule=order.event.settings.tax_rate_default,
                order=order,
            )
            f._calculate_tax()
            f.save()

        if cancellation_fee > order.total:
            raise OrderError(_('The cancellation fee cannot be higher than the total amount of this order.'))
        elif order.payment_refund_sum < cancellation_fee:
            order.status = Order.STATUS_PENDING
            order.set_expires()
        else:
            fee.canceled = True
            fee.save(update_fields=['canceled'])

            if new_fee:
                f = OrderFee(
                    fee_type=OrderFee.FEE_TYPE_CANCELLATION,
                    value=new_fee,
                    tax_rule=order.event.settings.tax_rate_default,
                    order=order,
                )
                f._calculate_tax()
                f.save()

            if cancellation_fee > order.total:
                raise OrderError(_('The cancellation fee cannot be higher than the total amount of this order.'))
            elif order.payment_refund_sum < cancellation_fee:
                order.status = Order.STATUS_PENDING
                order.set_expires()
            else:
                order.status = Order.STATUS_PAID
                order.total = cancellation_fee
                order.cancellation_date = now()
                order.save(update_fields=['status', 'cancellation_date', 'total'])
            order.status = Order.STATUS_PAID
            order.total = cancellation_fee
            order.cancellation_date = now()
            order.save(update_fields=['status', 'cancellation_date', 'total'])

        if cancel_invoice and i:
            invoices.append(generate_invoice(order))
    else:
        order.status = Order.STATUS_CANCELED
        order.cancellation_date = now()
        order.save(update_fields=['status', 'cancellation_date'])
        with order.event.lock():
            order.status = Order.STATUS_CANCELED
            order.cancellation_date = now()
            order.save(update_fields=['status', 'cancellation_date'])

        for position in order.positions.all():
            assign_ticket_secret(
@@ -666,27 +666,6 @@ def _check_positions(event: Event, now_dt: datetime, positions: List[CartPositio

    sorted_positions = sorted(positions, key=lambda s: -int(s.is_bundled))

    for cp in sorted_positions:
        cp._cached_quotas = list(cp.quotas)

    # Create locks
    if any(cp.expires < now() + timedelta(seconds=LOCK_TRUST_WINDOW) for cp in sorted_positions):
        # No need to perform any locking if the cart positions still guarantee everything long enough.
        full_lock_required = any(
            getattr(o, 'seat', False) for o in sorted_positions
        ) and event.settings.seating_minimal_distance > 0
        if full_lock_required:
            # We lock the entire event in this case since we don't want to deal with fine-granular locking
            # in the case of seating distance enforcement
            lock_objects([event])
        else:
            lock_objects(
                [q for q in reduce(operator.or_, (set(cp._cached_quotas) for cp in sorted_positions), set()) if q.size is not None] +
                [op.voucher for op in sorted_positions if op.voucher] +
                [op.seat for op in sorted_positions if op.seat],
                shared_lock_objects=[event]
            )

    # Check availability
    for i, cp in enumerate(sorted_positions):
        if cp.pk in deleted_positions:
@@ -696,7 +675,7 @@ def _check_positions(event: Event, now_dt: datetime, positions: List[CartPositio
            err = err or error_messages['unavailable']
            delete(cp)
            continue
        quotas = cp._cached_quotas
        quotas = list(cp.quotas)

        products_seen[cp.item] += 1
        if cp.item.max_per_order and products_seen[cp.item] > cp.item.max_per_order:
@@ -710,7 +689,6 @@ def _check_positions(event: Event, now_dt: datetime, positions: List[CartPositio
        if cp.voucher:
            v_usages[cp.voucher] += 1
            if cp.voucher not in v_avail:
                cp.voucher.refresh_from_db(fields=['redeemed'])
                redeemed_in_carts = CartPosition.objects.filter(
                    Q(voucher=cp.voucher) & Q(event=event) & Q(expires__gte=now_dt)
                ).exclude(cart_id=cp.cart_id)
@@ -949,87 +927,91 @@ def _create_order(event: Event, email: str, positions: List[CartPosition], now_d
    payments = []
    sales_channel = get_all_sales_channels()[sales_channel]

    try:
        validate_memberships_in_order(customer, positions, event, lock=True, testmode=event.testmode)
    except ValidationError as e:
        raise OrderError(e.message)
    with transaction.atomic():

        require_approval = any(p.requires_approval(invoice_address=address) for p in positions)
        try:
            fees = _get_fees(positions, payment_requests, address, meta_info, event, require_approval=require_approval)
        except TaxRule.SaleNotAllowed:
            raise OrderError(error_messages['country_blocked'])
        total = pending_sum = sum([c.price for c in positions]) + sum([c.value for c in fees])

        order = Order(
            status=Order.STATUS_PENDING,
            event=event,
            email=email,
            phone=(meta_info or {}).get('contact_form_data', {}).get('phone'),
            datetime=now_dt,
            locale=get_language_without_region(locale),
            total=total,
            testmode=True if sales_channel.testmode_supported and event.testmode else False,
            meta_info=json.dumps(meta_info or {}),
            require_approval=require_approval,
            sales_channel=sales_channel.identifier,
            customer=customer,
            valid_if_pending=valid_if_pending,
        )
        if customer:
            order.email_known_to_work = customer.is_verified
        order.set_expires(now_dt, event.subevents.filter(id__in=[p.subevent_id for p in positions]))
        order.save()

        if address:
            if address.order is not None:
                address.pk = None
            address.order = order
            address.save()

        for fee in fees:
            fee.order = order
            try:
                fee._calculate_tax()
        validate_memberships_in_order(customer, positions, event, lock=True, testmode=event.testmode)
    except ValidationError as e:
        raise OrderError(e.message)

    require_approval = any(p.requires_approval(invoice_address=address) for p in positions)
    try:
        fees = _get_fees(positions, payment_requests, address, meta_info, event, require_approval=require_approval)
    except TaxRule.SaleNotAllowed:
        raise OrderError(error_messages['country_blocked'])
            if fee.tax_rule and not fee.tax_rule.pk:
                fee.tax_rule = None  # TODO: deprecate
            fee.save()
    total = pending_sum = sum([c.price for c in positions]) + sum([c.value for c in fees])

    # Safety check: Is the amount we're now going to charge the same amount the user has been shown when they
    # pressed "Confirm purchase"? If not, we should better warn the user and show the confirmation page again.
    # A *known* case where this used to happen is if a gift card is used in two concurrent sessions,
    # but this is now a payment error instead. So currently this code branch is usually only triggered by bugs
    # in other places (e.g. tax calculation).
    if shown_total is not None:
        if Decimal(shown_total) != pending_sum:
            raise OrderError(
                _('While trying to place your order, we noticed that the order total has changed. Either one of '
                  'the prices changed just now, or a gift card you used has been used in the meantime. Please '
                  'check the prices below and try again.')
            )
|
||||
order = Order(
|
||||
status=Order.STATUS_PENDING,
|
||||
event=event,
|
||||
email=email,
|
||||
phone=(meta_info or {}).get('contact_form_data', {}).get('phone'),
|
||||
datetime=now_dt,
|
||||
locale=get_language_without_region(locale),
|
||||
total=total,
|
||||
testmode=True if sales_channel.testmode_supported and event.testmode else False,
|
||||
meta_info=json.dumps(meta_info or {}),
|
||||
require_approval=require_approval,
|
||||
sales_channel=sales_channel.identifier,
|
||||
customer=customer,
|
||||
valid_if_pending=valid_if_pending,
|
||||
)
|
||||
if customer:
|
||||
order.email_known_to_work = customer.is_verified
|
||||
order.set_expires(now_dt, event.subevents.filter(id__in=[p.subevent_id for p in positions]))
|
||||
order.save()
|
||||
|
||||
if payment_requests and not order.require_approval:
|
||||
for p in payment_requests:
|
||||
if not p.get('multi_use_supported') or p['payment_amount'] > Decimal('0.00'):
|
||||
payments.append(order.payments.create(
|
||||
state=OrderPayment.PAYMENT_STATE_CREATED,
|
||||
provider=p['provider'],
|
||||
amount=p['payment_amount'],
|
||||
fee=p.get('fee'),
|
||||
info=json.dumps(p['info_data']),
|
||||
process_initiated=False,
|
||||
))
|
||||
if address:
|
||||
if address.order is not None:
|
||||
address.pk = None
|
||||
address.order = order
|
||||
address.save()
|
||||
|
||||
orderpositions = OrderPosition.transform_cart_positions(positions, order)
|
||||
order.create_transactions(positions=orderpositions, fees=fees, is_new=True)
|
||||
order.log_action('pretix.event.order.placed')
|
||||
if order.require_approval:
|
||||
order.log_action('pretix.event.order.placed.require_approval')
|
||||
if meta_info:
|
||||
for msg in meta_info.get('confirm_messages', []):
|
||||
order.log_action('pretix.event.order.consent', data={'msg': msg})
|
||||
order.save()
|
||||
|
||||
for fee in fees:
|
||||
fee.order = order
|
||||
try:
|
||||
fee._calculate_tax()
|
||||
except TaxRule.SaleNotAllowed:
|
||||
raise OrderError(error_messages['country_blocked'])
|
||||
if fee.tax_rule and not fee.tax_rule.pk:
|
||||
fee.tax_rule = None # TODO: deprecate
|
||||
fee.save()
|
||||
|
||||
# Safety check: Is the amount we're now going to charge the same amount the user has been shown when they
|
||||
# pressed "Confirm purchase"? If not, we should better warn the user and show the confirmation page again.
|
||||
# We used to have a *known* case where this happened is if a gift card is used in two concurrent sessions,
|
||||
# but this is now a payment error instead. So currently this code branch is usually only triggered by bugs
|
||||
# in other places (e.g. tax calculation).
|
||||
if shown_total is not None:
|
||||
if Decimal(shown_total) != pending_sum:
|
||||
raise OrderError(
|
||||
_('While trying to place your order, we noticed that the order total has changed. Either one of '
|
||||
'the prices changed just now, or a gift card you used has been used in the meantime. Please '
|
||||
'check the prices below and try again.')
|
||||
)
|
||||
|
||||
if payment_requests and not order.require_approval:
|
||||
for p in payment_requests:
|
||||
if not p.get('multi_use_supported') or p['payment_amount'] > Decimal('0.00'):
|
||||
payments.append(order.payments.create(
|
||||
state=OrderPayment.PAYMENT_STATE_CREATED,
|
||||
provider=p['provider'],
|
||||
amount=p['payment_amount'],
|
||||
fee=p.get('fee'),
|
||||
info=json.dumps(p['info_data']),
|
||||
process_initiated=False,
|
||||
))
|
||||
|
||||
orderpositions = OrderPosition.transform_cart_positions(positions, order)
|
||||
order.create_transactions(positions=orderpositions, fees=fees, is_new=True)
|
||||
order.log_action('pretix.event.order.placed')
|
||||
if order.require_approval:
|
||||
order.log_action('pretix.event.order.placed.require_approval')
|
||||
if meta_info:
|
||||
for msg in meta_info.get('confirm_messages', []):
|
||||
order.log_action('pretix.event.order.consent', data={'msg': msg})
|
||||
|
||||
order_placed.send(event, order=order)
|
||||
return order, payments
|
||||
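The `shown_total` safety check in the hunk above boils down to recomputing the pending sum and comparing it with the figure the buyer confirmed. A minimal standalone sketch of that check — the helper name `check_shown_total` and the dict shapes are illustrative, not pretix API:

```python
from decimal import Decimal

class OrderError(Exception):
    pass

def check_shown_total(shown_total, positions, fees):
    # Recompute the total the order would actually be created with.
    pending_sum = sum(p["price"] for p in positions) + sum(f["value"] for f in fees)
    # Mirrors the OrderError raised in the diff when the totals diverge,
    # e.g. after a concurrent price or gift-card change.
    if shown_total is not None and Decimal(shown_total) != pending_sum:
        raise OrderError("order total changed between confirmation and placement")
    return pending_sum
```

In the real code path this error sends the buyer back to the confirmation page rather than silently charging a different amount.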
@@ -1134,12 +1116,18 @@ def _perform_order(event: Event, payment_requests: List[dict], position_ids: Lis
if result:
valid_if_pending = True

lockfn = NoLockManager
locked = False
if positions.filter(Q(voucher__isnull=False) | Q(expires__lt=now() + timedelta(minutes=2)) | Q(seat__isnull=False)).exists():
# Performance optimization: If no voucher is used and no cart position is dangerously close to its expiry date,
# creating this order shouldn't be prone to any race conditions and we don't need to lock the event.
locked = True
lockfn = event.lock

warnings = []
any_payment_failed = False

now_dt = now()
err_out = None
with transaction.atomic(durable=True):
with lockfn() as now_dt:
positions = list(
positions.select_related('item', 'variation', 'subevent', 'seat', 'addon_to').prefetch_related('addons')
)
@@ -1148,28 +1136,16 @@ def _perform_order(event: Event, payment_requests: List[dict], position_ids: Lis
raise OrderError(error_messages['empty'])
if len(position_ids) != len(positions):
raise OrderError(error_messages['internal'])
_check_positions(event, now_dt, positions, address=addr, sales_channel=sales_channel, customer=customer)
order, payment_objs = _create_order(event, email, positions, now_dt, payment_requests,
locale=locale, address=addr, meta_info=meta_info, sales_channel=sales_channel,
shown_total=shown_total, customer=customer, valid_if_pending=valid_if_pending)
try:
_check_positions(event, now_dt, positions, address=addr, sales_channel=sales_channel, customer=customer)
except OrderError as e:
err_out = e # Don't raise directly to make sure transaction is committed, as it might have deleted things
else:
if 'sleep-after-quota-check' in debugflags_var.get():
sleep(2)

order, payment_objs = _create_order(event, email, positions, now_dt, payment_requests,
locale=locale, address=addr, meta_info=meta_info, sales_channel=sales_channel,
shown_total=shown_total, customer=customer, valid_if_pending=valid_if_pending)

try:
for p in payment_objs:
if p.provider == 'free':
# Passing lock=False is safe here because it's absolutely impossible for the order to be expired
# here before it is even committed.
p.confirm(send_mail=False, lock=False, generate_invoice=False)
except Quota.QuotaExceededException:
pass
if err_out:
raise err_out
for p in payment_objs:
if p.provider == 'free':
p.confirm(send_mail=False, lock=not locked, generate_invoice=False)
except Quota.QuotaExceededException:
pass

# We give special treatment to GiftCardPayment here because our invoice renderer expects gift cards to already be
# processed, and because we historically treat gift card orders like free orders with regards to email texts.
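The `err_out` handling in this hunk is a "commit first, raise later" pattern: the position check may have deleted expired cart rows, and that deletion must be committed even when the check fails, so the exception is stashed and re-raised outside the atomic block. A toy sketch of the pattern (the `atomic` context manager and `check_positions` are stand-ins, not Django or pretix API):

```python
from contextlib import contextmanager

class OrderError(Exception):
    pass

committed = {"flag": False}

@contextmanager
def atomic():
    # Toy stand-in for transaction.atomic(durable=True):
    # "commits" only when the block exits without an exception.
    yield
    committed["flag"] = True

def check_positions():
    # Pretend the check deleted some expired rows, then failed.
    raise OrderError("some products became unavailable")

def perform_order():
    err_out = None
    with atomic():
        try:
            check_positions()
        except OrderError as e:
            # Don't raise inside the block: the deletions performed by the
            # check must still be committed.
            err_out = e
    if err_out:
        raise err_out
```

Raising inside the `with` block instead would roll back the whole transaction and lose the cleanup.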
@@ -1294,12 +1270,12 @@ def expire_orders(sender, **kwargs):
Exists(
OrderFee.objects.filter(order_id=OuterRef('pk'), fee_type=OrderFee.FEE_TYPE_CANCELLATION)
)
).prefetch_related('event').order_by('event_id')
).select_related('event').order_by('event_id')
for o in qs:
if o.event_id != event_id:
expire = o.event.settings.get('payment_term_expire_automatically', as_type=bool)
event_id = o.event_id
if expire and now() >= o.payment_term_expire_date:
if expire:
mark_order_expired(o)

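Because the queryset above is ordered by `event_id`, the loop only re-reads the event's settings when the event changes, not once per order. A toy version of that loop — `expire_orders_sketch` and the dict shapes are illustrative, and the newer diff side additionally gates on `payment_term_expire_date`, which is omitted here:

```python
def expire_orders_sketch(orders):
    """orders must be sorted by event_id; the setting is looked up once per event."""
    expired = []
    event_id = None
    expire = None
    for o in orders:
        if o["event_id"] != event_id:
            # Settings lookup is comparatively expensive in the real code,
            # so it is cached across consecutive orders of the same event.
            expire = o["settings"]["payment_term_expire_automatically"]
            event_id = o["event_id"]
        if expire:
            expired.append(o["id"])
    return expired
```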
@@ -1313,19 +1289,9 @@ def send_expiry_warnings(sender, **kwargs):
event_id = None

for o in Order.objects.filter(
expires__gte=today, expiry_reminder_sent=False, status=Order.STATUS_PENDING,
datetime__lte=now() - timedelta(hours=2), require_approval=False
expires__gte=today, expiry_reminder_sent=False, status=Order.STATUS_PENDING,
datetime__lte=now() - timedelta(hours=2), require_approval=False
).only('pk', 'event_id', 'expires').order_by('event_id'):

lp = o.payments.last()
if (
lp and
lp.state in [OrderPayment.PAYMENT_STATE_CREATED, OrderPayment.PAYMENT_STATE_PENDING] and
lp.payment_provider and
lp.payment_provider.prevent_reminder_mail(o, lp)
):
continue

if event_id != o.event_id:
settings = o.event.settings
days = cache.get_or_set('{}:{}:setting_mail_days_order_expire_warning'.format('event', o.event_id),
@@ -2510,11 +2476,6 @@ class OrderChangeManager:
split_order.status = Order.STATUS_PAID
else:
split_order.status = Order.STATUS_PENDING
if self.order.status == Order.STATUS_PAID:
split_order.set_expires(
now(),
list(set(p.subevent_id for p in split_positions))
)
split_order.save()

if offset_amount > Decimal('0.00'):
@@ -2682,19 +2643,6 @@ class OrderChangeManager:
except ValidationError as e:
raise OrderError(e.message)

def _create_locks(self):
full_lock_required = any(diff > 0 for diff in self._seatdiff.values()) and self.event.settings.seating_minimal_distance > 0
if full_lock_required:
# We lock the entire event in this case since we don't want to deal with fine-granular locking
# in the case of seating distance enforcement
lock_objects([self.event])
else:
lock_objects(
[q for q, d in self._quotadiff.items() if q.size is not None and d > 0] +
[s for s, d in self._seatdiff.items() if d > 0],
shared_lock_objects=[self.event]
)

def commit(self, check_quotas=True):
if self._committed:
# an order change can only be committed once
@@ -2714,17 +2662,17 @@ class OrderChangeManager:
self._payment_fee_diff()

with transaction.atomic():
if self.order.status in (Order.STATUS_PENDING, Order.STATUS_PAID):
if check_quotas:
self._check_quotas()
self._check_seats()
self._create_locks()
self._check_complete_cancel()
self._check_and_lock_memberships()
try:
self._perform_operations()
except TaxRule.SaleNotAllowed:
raise OrderError(self.error_messages['tax_rule_country_blocked'])
with self.order.event.lock():
if self.order.status in (Order.STATUS_PENDING, Order.STATUS_PAID):
if check_quotas:
self._check_quotas()
self._check_seats()
self._check_complete_cancel()
self._check_and_lock_memberships()
try:
self._perform_operations()
except TaxRule.SaleNotAllowed:
raise OrderError(self.error_messages['tax_rule_country_blocked'])
self._recalculate_total_and_payment_fee()
self._check_paid_price_change()
self._check_paid_to_free()

@@ -171,7 +171,7 @@ def apply_discounts(event: Event, sales_channel: str,
Q(available_until__isnull=True) | Q(available_until__gte=now()),
sales_channels__contains=sales_channel,
active=True,
).prefetch_related('condition_limit_products', 'benefit_limit_products').order_by('position', 'pk')
).prefetch_related('condition_limit_products').order_by('position', 'pk')
for discount in discount_qs:
result = discount.apply({
idx: (item_id, subevent_id, line_price_gross, is_addon_to, voucher_discount)

@@ -24,13 +24,13 @@ import time
from collections import Counter, defaultdict
from itertools import zip_longest

import django_redis
from django.conf import settings
from django.db import models
from django.db.models import (
Case, Count, F, Func, Max, OuterRef, Q, Subquery, Sum, Value, When,
)
from django.utils.timezone import now
from django_redis import get_redis_connection

from pretix.base.models import (
CartPosition, Checkin, Order, OrderPosition, Quota, Voucher,
@@ -102,12 +96,6 @@ class QuotaAvailability:
self.count_waitinglist = defaultdict(int)
self.count_cart = defaultdict(int)

self._cache_key_suffix = ""
if not self._count_waitinglist:
self._cache_key_suffix += ":nocw"
if self._ignore_closed:
self._cache_key_suffix += ":igcl"

self.sizes = {}

def queue(self, *quota):
@@ -127,14 +121,17 @@
if self._full_results:
raise ValueError("You cannot combine full_results and allow_cache.")

elif not self._count_waitinglist:
raise ValueError("If you set allow_cache, you need to set count_waitinglist.")

elif settings.HAS_REDIS:
rc = django_redis.get_redis_connection("redis")
rc = get_redis_connection("redis")
quotas_by_event = defaultdict(list)
for q in [_q for _q in self._queue if _q.id in quota_ids_set]:
quotas_by_event[q.event_id].append(q)

for eventid, evquotas in quotas_by_event.items():
d = rc.hmget(f'quotas:{eventid}:availabilitycache{self._cache_key_suffix}', [str(q.pk) for q in evquotas])
d = rc.hmget(f'quotas:{eventid}:availabilitycache', [str(q.pk) for q in evquotas])
for redisval, q in zip(d, evquotas):
if redisval is not None:
data = [rv for rv in redisval.decode().split(',')]
@@ -167,12 +164,12 @@
if not settings.HAS_REDIS or not quotas:
return

rc = django_redis.get_redis_connection("redis")
rc = get_redis_connection("redis")
# We write the computed availability to redis in a per-event hash as
#
# quota_id -> (availability_state, availability_number, timestamp).
#
# We store this in a hash instead of individual values to avoid making too many redis requests
# We store this in a hash instead of inidividual values to avoid making two many redis requests
# which would introduce latency.

# The individual entries in the hash are "valid" for 120 seconds. This means in a typical peak scenario with
@@ -182,16 +179,16 @@
# these quotas. We choose 10 seconds since that should be well above the duration of a write.

lock_name = '_'.join([str(p) for p in sorted([q.pk for q in quotas])])
if rc.exists(f'quotas:availabilitycachewrite:{lock_name}{self._cache_key_suffix}'):
if rc.exists(f'quotas:availabilitycachewrite:{lock_name}'):
return
rc.setex(f'quotas:availabilitycachewrite:{lock_name}{self._cache_key_suffix}', '1', 10)
rc.setex(f'quotas:availabilitycachewrite:{lock_name}', '1', 10)

update = defaultdict(list)
for q in quotas:
update[q.event_id].append(q)

for eventid, quotas in update.items():
rc.hset(f'quotas:{eventid}:availabilitycache{self._cache_key_suffix}', mapping={
rc.hmset(f'quotas:{eventid}:availabilitycache', {
str(q.id): ",".join(
[str(i) for i in self.results[q]] +
[str(int(time.time()))]
@@ -200,7 +197,7 @@
# To make sure old events do not fill up our redis instance, we set an expiry on the cache. However, we set it
# on 7 days even though we mostly ignore values older than 2 monites. The reasoning is that we have some places
# where we set allow_cache_stale and use the old entries anyways to save on performance.
rc.expire(f'quotas:{eventid}:availabilitycache{self._cache_key_suffix}', 3600 * 24 * 7)
rc.expire(f'quotas:{eventid}:availabilitycache', 3600 * 24 * 7)

# We used to also delete item_quota_cache:* from the event cache here, but as the cache
# gets more complex, this does not seem worth it. The cache is only present for up to

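The caching scheme the comments above describe — one redis hash per event mapping quota id to a comma-joined `(state, number, timestamp)` string, with a freshness window — can be sketched with a plain dict standing in for the redis hash. The function names and the integer-only `number` are illustrative simplifications; the real code also varies the key by a cache-key suffix and handles unlimited quotas:

```python
import time
from collections import defaultdict

# Stand-in for the per-event redis hash written via
# rc.hset(f'quotas:{event_id}:availabilitycache', mapping=...) in the diff.
_cache = defaultdict(dict)

def write_availability(event_id, results):
    """Serialize (state, number) plus a timestamp, one hash field per quota."""
    for quota_id, (state, number) in results.items():
        _cache[event_id][str(quota_id)] = ",".join(
            [str(state), str(number), str(int(time.time()))]
        )

def read_availability(event_id, quota_id, max_age=120):
    """Return (state, number) only if the cached entry is younger than max_age seconds."""
    raw = _cache[event_id].get(str(quota_id))
    if raw is None:
        return None
    state, number, ts = raw.split(",")
    if time.time() - int(ts) > max_age:
        return None  # stale entry, caller recomputes
    return int(state), int(number)
```

Batching all quotas of an event into one hash means a single `HMGET` per event instead of one round trip per quota, which is the latency point the comment makes.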
@@ -22,19 +22,15 @@
import sys
from datetime import timedelta

from django.db import transaction
from django.db.models import (
Exists, F, OuterRef, Prefetch, Q, Sum, prefetch_related_objects,
)
from django.db.models import Exists, F, OuterRef, Q, Sum
from django.dispatch import receiver
from django.utils.timezone import now
from django_scopes import scopes_disabled

from pretix.base.models import (
Event, EventMetaValue, SeatCategoryMapping, User, WaitingListEntry,
Event, SeatCategoryMapping, User, WaitingListEntry,
)
from pretix.base.models.waitinglist import WaitingListException
from pretix.base.services.locking import lock_objects
from pretix.base.services.tasks import EventTask
from pretix.base.signals import periodic_task
from pretix.celery_app import app
@@ -63,21 +59,8 @@ def assign_automatically(event: Event, user_id: int=None, subevent_id: int=None)
).aggregate(free=Sum(F('max_usages') - F('redeemed')))['free'] or 0
seats_available[(m.product_id, m.subevent_id)] = num_free_seets_for_product - num_valid_vouchers_for_product

prefetch_related_objects(
[event.organizer],
'meta_properties'
)
prefetch_related_objects(
[event],
Prefetch(
'meta_values',
EventMetaValue.objects.select_related('property'),
to_attr='meta_values_cached'
)
)

qs = event.waitinglistentries.filter(
voucher__isnull=True
qs = WaitingListEntry.objects.filter(
event=event, voucher__isnull=True
).select_related('item', 'variation', 'subevent').prefetch_related(
'item__quotas', 'variation__quotas'
).order_by('-priority', 'created')
@@ -88,23 +71,11 @@ def assign_automatically(event: Event, user_id: int=None, subevent_id: int=None)

sent = 0

with transaction.atomic(durable=True):
quotas_by_item = {}
quotas = set()
for wle in qs:
if (wle.item_id, wle.variation_id, wle.subevent_id) not in quotas_by_item:
quotas_by_item[wle.item_id, wle.variation_id, wle.subevent_id] = list(
wle.variation.quotas.filter(subevent=wle.subevent)
if wle.variation
else wle.item.quotas.filter(subevent=wle.subevent)
)
wle._quotas = quotas_by_item[wle.item_id, wle.variation_id, wle.subevent_id]
quotas |= set(wle._quotas)

lock_objects(quotas, shared_lock_objects=[event])
with event.lock():
for wle in qs:
if (wle.item, wle.variation, wle.subevent) in gone:
continue

ev = (wle.subevent or event)
if not ev.presale_is_running or (wle.subevent and not wle.subevent.active):
continue
@@ -119,6 +90,9 @@ def assign_automatically(event: Event, user_id: int=None, subevent_id: int=None)
gone.add((wle.item, wle.variation, wle.subevent))
continue

quotas = (wle.variation.quotas.filter(subevent=wle.subevent)
if wle.variation
else wle.item.quotas.filter(subevent=wle.subevent))
availability = (
wle.variation.check_quotas(count_waitinglist=False, _cache=quota_cache, subevent=wle.subevent)
if wle.variation
@@ -132,7 +106,7 @@ def assign_automatically(event: Event, user_id: int=None, subevent_id: int=None)
continue

# Reduce affected quotas in cache
for q in wle._quotas:
for q in quotas:
quota_cache[q.pk] = (
quota_cache[q.pk][0] if quota_cache[q.pk][0] > 1 else 0,
quota_cache[q.pk][1] - 1 if quota_cache[q.pk][1] is not None else sys.maxsize

File diff suppressed because one or more lines are too long
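The quota-cache update at the end of the waiting-list hunk decrements each affected quota in a local cache: the availability state drops to 0 once the last seat is taken, and `None` (unlimited) is treated as effectively infinite. A standalone mirror of that update — `consume_from_cache` is an illustrative name, not pretix API:

```python
import sys

def consume_from_cache(quota_cache, quota_ids):
    """Decrement cached availability after assigning one waiting-list entry."""
    for pk in quota_ids:
        state, number = quota_cache[pk]
        quota_cache[pk] = (
            # Keep the availability state unless this was the last unit.
            state if state > 1 else 0,
            # None means unlimited; substitute a huge number so later
            # decrements keep working.
            number - 1 if number is not None else sys.maxsize,
        )
```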
@@ -210,8 +210,6 @@ def slow_delete(qs, batch_size=1000, sleep_time=.5, progress_callback=None, prog
break
if total_deleted >= 0.8 * batch_size:
time.sleep(sleep_time)
if progress_callback and progress_total:
progress_callback((progress_offset + total_deleted) / progress_total)
return total_deleted


@@ -683,16 +683,12 @@ dictionaries as values that contain keys like in the following example::
"product": {
"label": _("Product name"),
"editor_sample": _("Sample product"),
"evaluate": lambda orderposition, order, event: str(orderposition.item),
"evaluate_bulk": lambda orderpositions: [str(op.item) for op in orderpositions],
"evaluate": lambda orderposition, order, event: str(orderposition.item)
}
}

The ``evaluate`` member will be called with the order position, order and event as arguments. The event might
also be a subevent, if applicable.

The ``evaluate_bulk`` member is optional but can significantly improve performance in some situations because you
can perform database fetches in bulk instead of single queries for every position.
"""

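The documented placeholder shape above can be exercised in isolation. This sketch follows the documented keys (`label`, `editor_sample`, `evaluate`, `evaluate_bulk`); the `"seat"` entry, the plain strings instead of `_()` translations, and the dict-based positions are illustrative, not pretix itself:

```python
def get_placeholders():
    return {
        "seat": {
            "label": "Seat",
            "editor_sample": "Seat A1",
            # Called once per position with (orderposition, order, event).
            "evaluate": lambda orderposition, order, event: str(orderposition["seat"]),
            # Optional: resolve many positions at once so database fetches
            # can be batched instead of issued per row.
            "evaluate_bulk": lambda orderpositions: [str(op["seat"]) for op in orderpositions],
        }
    }
```

When `evaluate_bulk` is present, a renderer producing hundreds of badges can make one query for all positions rather than one per position, which is the performance point the docs make.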
@@ -791,23 +787,3 @@ return a dictionary mapping names of attributes in the settings store to DRF ser

As with all event-plugin signals, the ``sender`` keyword argument will contain the event.
"""

customer_created = GlobalSignal()
"""
Arguments: ``customer``

This signal is sent out every time a customer account is created. The ``customer``
object is given as the first argument.

The ``sender`` keyword argument will contain the organizer.
"""

customer_signed_in = GlobalSignal()
"""
Arguments: ``customer``

This signal is sent out every time a customer signs in. The ``customer`` object
is given as the first argument.

The ``sender`` keyword argument will contain the organizer.
"""

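The `customer_created` / `customer_signed_in` docs above describe the usual send/receive contract: the organizer as `sender`, the customer as a keyword argument. A minimal toy dispatcher illustrating that contract — the `Signal` class here is a stdlib stand-in, not pretix's `GlobalSignal` (which follows Django's signal API):

```python
class Signal:
    """Minimal stand-in for a Django-style signal dispatcher."""
    def __init__(self):
        self._receivers = []

    def connect(self, fn):
        self._receivers.append(fn)

    def send(self, sender, **kwargs):
        # Call every receiver with the sender plus keyword arguments,
        # collecting the return values like Django's Signal.send does.
        return [fn(sender, **kwargs) for fn in self._receivers]

customer_created = Signal()

def log_new_customer(sender, customer, **kwargs):
    # Per the docs above, sender is the organizer and customer the new account.
    return f"{sender}: created {customer}"

customer_created.connect(log_new_customer)
```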
@@ -48,8 +48,6 @@ from django.utils.http import url_has_allowed_host_and_scheme
from django.utils.safestring import mark_safe
from markdown import Extension
from markdown.inlinepatterns import SubstituteTagInlineProcessor
from markdown.postprocessors import Postprocessor
from markdown.treeprocessors import UnescapeTreeprocessor
from tlds import tld_set

register = template.Library()
@@ -110,8 +108,6 @@ URL_RE = SimpleLazyObject(lambda: build_url_re(tlds=sorted(tld_set, key=len, rev

EMAIL_RE = SimpleLazyObject(lambda: build_email_re(tlds=sorted(tld_set, key=len, reverse=True)))

DOT_ESCAPE = "|escaped-dot-sGnY9LMK|"


def safelink_callback(attrs, new=False):
"""
@@ -189,130 +185,25 @@ class EmailNl2BrExtension(Extension):
md.inlinePatterns.register(br_tag, 'nl', 5)

class LinkifyPostprocessor(Postprocessor):
def __init__(self, linker):
self.linker = linker
super().__init__()

def run(self, text):
return self.linker.linkify(text)


class CleanPostprocessor(Postprocessor):
def __init__(self, tags, attributes, protocols, strip):
self.tags = tags
self.attributes = attributes
self.protocols = protocols
self.strip = strip
super().__init__()

def run(self, text):
return bleach.clean(
text,
tags=self.tags,
attributes=self.attributes,
protocols=self.protocols,
strip=self.strip
)


class CustomUnescapeTreeprocessor(UnescapeTreeprocessor):
"""
This un-escapes everything except \\.
"""

def _unescape(self, m):
if m.group(1) == "46": # 46 is the ASCII position of .
return DOT_ESCAPE
return chr(int(m.group(1)))


class CustomUnescapePostprocessor(Postprocessor):
"""
Restore escaped .
"""

def run(self, text):
return text.replace(DOT_ESCAPE, ".")


class LinkifyAndCleanExtension(Extension):
r"""
We want to do:

input --> markdown --> bleach clean --> linkify --> output

Internally, the markdown library does:

source --> parse --> (tree|inline)processors --> serializing --> postprocessors

All escaped characters such as \. will be turned to something like <STX>46<ETX> in the processors
step and then will be converted to . back again in the last tree processor, before serialization.
Therefore, linkify does not see the escaped character anymore. This is annoying for the one case
where you want to type "rich_text.py" and *not* have it turned into a link, since you can't type
"rich_text\.py" either.

A simple solution would be to run linkify before markdown, but that may cause other issues when
linkify messes with the markdown syntax and it makes handling our attributes etc. harder.

So we do a weird hack where we modify the unescape processor to unescape everything EXCEPT for the
dot and then unescape that one manually after linkify. However, to make things even harder, the bleach
clean step removes any invisible characters, so we need to cheat a bit more.
"""

def __init__(self, linker, tags, attributes, protocols, strip):
self.linker = linker
self.tags = tags
self.attributes = attributes
self.protocols = protocols
self.strip = strip
super().__init__()

def extendMarkdown(self, md):
md.treeprocessors.deregister('unescape')
md.treeprocessors.register(
CustomUnescapeTreeprocessor(md),
'unescape',
0
)
md.postprocessors.register(
CleanPostprocessor(self.tags, self.attributes, self.protocols, self.strip),
'clean',
2
)
md.postprocessors.register(
LinkifyPostprocessor(self.linker),
'linkify',
1
)
md.postprocessors.register(
CustomUnescapePostprocessor(self.linker),
'unescape_dot',
0
)


def markdown_compile_email(source, allowed_tags=ALLOWED_TAGS, allowed_attributes=ALLOWED_ATTRIBUTES):
def markdown_compile_email(source):
linker = bleach.Linker(
url_re=URL_RE,
email_re=EMAIL_RE,
callbacks=DEFAULT_CALLBACKS + [truelink_callback, abslink_callback],
parse_email=True
)
return markdown.markdown(
source,
extensions=[
'markdown.extensions.sane_lists',
EmailNl2BrExtension(),
LinkifyAndCleanExtension(
linker,
tags=allowed_tags,
attributes=allowed_attributes,
protocols=ALLOWED_PROTOCOLS,
strip=False,
)
]
)
return linker.linkify(bleach.clean(
markdown.markdown(
source,
extensions=[
'markdown.extensions.sane_lists',
EmailNl2BrExtension(),
]
),
tags=ALLOWED_TAGS,
attributes=ALLOWED_ATTRIBUTES,
protocols=ALLOWED_PROTOCOLS,
))

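The escaped-dot trick described in the `LinkifyAndCleanExtension` docstring — protect `\.` behind a sentinel so the linkifier never sees a dot, then restore it afterwards — can be demonstrated end to end with stdlib tools. The `naive_linkify` regex is a toy stand-in for bleach's linkifier, and `compile_snippet` is an illustrative name, not the real pipeline:

```python
import re

# Same sentinel idea as DOT_ESCAPE in the diff: contains no dot, so a
# dot-matching linkifier cannot trigger on it.
DOT_ESCAPE = "|escaped-dot-sGnY9LMK|"

def protect_escaped_dots(source):
    """Replace backslash-escaped dots with the sentinel before linkification."""
    return source.replace(r"\.", DOT_ESCAPE)

def naive_linkify(text):
    """Toy linkifier: wraps bare word.tld tokens in anchor tags."""
    return re.sub(r"\b(\w+\.\w{2,3})\b", r'<a href="http://\1">\1</a>', text)

def restore_escaped_dots(text):
    """After linkification, turn the sentinel back into a literal dot."""
    return text.replace(DOT_ESCAPE, ".")

def compile_snippet(source):
    return restore_escaped_dots(naive_linkify(protect_escaped_dots(source)))
```

With this ordering, `rich_text\.py` survives as plain text while `example.com` still becomes a link, which is exactly the case the docstring says motivates the hack.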
class SnippetExtension(markdown.extensions.Extension):
@@ -322,24 +213,23 @@ class SnippetExtension(markdown.extensions.Extension):
md.parser.blockprocessors.deregister('quote')


def markdown_compile(source, linker, snippet=False):
def markdown_compile(source, snippet=False):
tags = ALLOWED_TAGS_SNIPPET if snippet else ALLOWED_TAGS
exts = [
'markdown.extensions.sane_lists',
'markdown.extensions.nl2br',
LinkifyAndCleanExtension(
linker,
tags=tags,
attributes=ALLOWED_ATTRIBUTES,
protocols=ALLOWED_PROTOCOLS,
strip=snippet,
)
'markdown.extensions.nl2br'
]
if snippet:
exts.append(SnippetExtension())
return markdown.markdown(
source,
extensions=exts
return bleach.clean(
markdown.markdown(
source,
extensions=exts
),
strip=snippet,
tags=tags,
attributes=ALLOWED_ATTRIBUTES,
protocols=ALLOWED_PROTOCOLS,
)


@@ -355,7 +245,7 @@ def rich_text(text: str, **kwargs):
callbacks=DEFAULT_CALLBACKS + ([truelink_callback, safelink_callback] if kwargs.get('safelinks', True) else [truelink_callback, abslink_callback]),
parse_email=True
)
body_md = markdown_compile(text, linker)
body_md = linker.linkify(markdown_compile(text))
return mark_safe(body_md)


@@ -371,5 +261,5 @@ def rich_text_snippet(text: str, **kwargs):
callbacks=DEFAULT_CALLBACKS + ([truelink_callback, safelink_callback] if kwargs.get('safelinks', True) else [truelink_callback, abslink_callback]),
parse_email=True
)
body_md = markdown_compile(text, linker, snippet=True)
body_md = linker.linkify(markdown_compile(text, snippet=True))
return mark_safe(body_md)

@@ -394,8 +394,6 @@ class SerializerDateFrameField(serializers.CharField):
resolve_timeframe_to_dates_inclusive(now(), data, timezone.utc)
except:
raise ValidationError("Invalid date frame")
else:
return data

def to_representation(self, value):
if value is None:

@@ -1,53 +0,0 @@
#
# This file is part of pretix (Community Edition).
#
# Copyright (C) 2014-2020 Raphael Michel and contributors
# Copyright (C) 2020-2021 rami.io GmbH and contributors
#
# This program is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General
# Public License as published by the Free Software Foundation in version 3 of the License.
#
# ADDITIONAL TERMS APPLY: Pursuant to Section 7 of the GNU Affero General Public License, additional terms are
# applicable granting you additional permissions and placing additional restrictions on your usage of this software.
# Please refer to the pretix LICENSE file to obtain the full terms applicable to this work. If you did not receive
# this file, see <https://pretix.eu/about/en/license>.
#
# This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied
# warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more
# details.
#
# You should have received a copy of the GNU Affero General Public License along with this program. If not, see
# <https://www.gnu.org/licenses/>.
#
from django.http import Http404, HttpResponse

from pretix.base.settings import GlobalSettingsObject


def association(request, *args, **kwargs):
# This is a crutch to enable event- or organizer-level overrides for the default
# ApplePay MerchantID domain validation/association file.
# We do not provide any FormFields for this on purpose!
#
# Please refer to https://github.com/pretix/pretix/pull/3611 to get updates on
# the upcoming and official way to temporarily override the association-file,
# which will make sure that there are no conflicting requests at the same time.
#
# Should you opt to manually inject a different association-file into an organizer
# or event settings store, we do recommend to remove the setting once you're
# done and the domain has been validated.
#
# If you do not need Stripe's default domain association credential and would
# rather serve a different default credential, you can do so through the
# Global Settings editor.
if hasattr(request, 'event'):
settings = request.event.settings
elif hasattr(request, 'organizer'):
settings = request.organizer.settings
else:
settings = GlobalSettingsObject().settings

if not settings.get('apple_domain_association', None):
raise Http404('')
else:
return HttpResponse(settings.get('apple_domain_association'))
@@ -29,7 +29,6 @@ from celery.result import AsyncResult
|
||||
from django.conf import settings
|
||||
from django.contrib import messages
|
||||
from django.core.exceptions import PermissionDenied, ValidationError
|
||||
from django.db import transaction
|
||||
from django.http import HttpResponse, JsonResponse, QueryDict
|
||||
from django.shortcuts import redirect, render
|
||||
from django.test import RequestFactory
|
||||
@@ -218,7 +217,6 @@ class AsyncFormView(AsyncMixin, FormView):
|
||||
known_errortypes = ['ValidationError']
|
||||
expected_exceptions = (ValidationError,)
|
||||
task_base = ProfiledEventTask
|
||||
atomic_execute = False
|
||||
|
||||
def async_set_progress(self, percentage):
|
||||
if not self._task_self.request.called_directly:
|
||||
@@ -265,9 +263,6 @@ class AsyncFormView(AsyncMixin, FormView):
|
||||
form.is_valid()
|
||||
return view_instance.async_form_valid(self, form)
|
||||
|
||||
if cls.atomic_execute:
|
||||
async_execute = transaction.atomic(async_execute)
|
||||
|
||||
cls.async_execute = app.task(
|
||||
base=cls.task_base,
|
||||
bind=True,
|
||||
|
||||
@@ -44,7 +44,6 @@ from django.forms.utils import from_current_timezone
|
||||
from django.urls import reverse
|
||||
from django.utils.html import escape
|
||||
from django.utils.safestring import mark_safe
|
||||
from django.utils.text import format_lazy
|
||||
from django.utils.timezone import now
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from django_scopes.forms import SafeModelMultipleChoiceField
|
||||
@@ -52,7 +51,6 @@ from django_scopes.forms import SafeModelMultipleChoiceField
|
||||
from pretix.helpers.hierarkey import clean_filename
|
||||
|
||||
from ...base.forms import I18nModelForm
|
||||
from ...helpers.i18n import get_language_score
|
||||
from ...helpers.images import (
|
||||
IMAGE_EXTS, validate_uploaded_file_for_valid_image,
|
||||
)
|
||||
@@ -302,44 +300,18 @@ class SlugWidget(forms.TextInput):
|
||||
|
||||
|
||||
class MultipleLanguagesWidget(forms.CheckboxSelectMultiple):
|
||||
template_name = 'pretixcontrol/multi_languages_select.html'
|
||||
option_template_name = 'pretixcontrol/multi_languages_widget.html'
|
||||
|
||||
def sort(self):
|
||||
def filter_and_sort(choices, languages, cond=True):
|
||||
return sorted(
|
||||
[c for c in choices if (c[0] in languages) == cond],
|
||||
key=lambda c: str(c[1])
|
||||
)
|
||||
self.choices = (
|
||||
self.choices = sorted(self.choices, key=lambda l: (
|
||||
(
|
||||
'',
|
||||
filter_and_sort(self.choices, settings.LANGUAGES_OFFICIAL)
|
||||
),
|
||||
(
|
||||
(
|
||||
_('Community translations'),
|
||||
format_lazy(
|
||||
_('These translations are not maintained by the pretix team. We cannot vouch for their correctness '
|
||||
'and new or recently changed features might not be translated and will show in English instead. '
|
||||
'You can <a href="{translate_url}" target="_blank">help translating</a>.'),
|
||||
translate_url='https://translate.pretix.eu'
|
||||
),
|
||||
'fa fa-group'
|
||||
),
|
||||
filter_and_sort(self.choices, settings.LANGUAGES_OFFICIAL.union(settings.LANGUAGES_INCUBATING), False)
|
||||
),
|
||||
(
|
||||
(
|
||||
_('Development only'),
|
||||
_('These translations are still in progress. These languages can currently only be selected on development '
|
||||
'installations of pretix, not in production.'),
|
||||
'fa fa-flask text-danger'
|
||||
),
|
||||
filter_and_sort(self.choices, settings.LANGUAGES_INCUBATING)
|
||||
)
|
||||
)
|
||||
self.choices = [c for c in self.choices if len(c[1])]
|
||||
0 if l[0] in settings.LANGUAGES_OFFICIAL
|
||||
else (
|
||||
1 if l[0] not in settings.LANGUAGES_INCUBATING
|
||||
else 2
|
||||
)
|
||||
), str(l[1])
|
||||
))
|
||||
|
||||
def options(self, name, value, attrs=None):
|
||||
self.sort()
|
||||
@@ -353,8 +325,6 @@ class MultipleLanguagesWidget(forms.CheckboxSelectMultiple):
|
||||
opt = super().create_option(name, value, label, selected, index, subindex, attrs)
|
||||
opt['official'] = value in settings.LANGUAGES_OFFICIAL
|
||||
opt['incubating'] = value in settings.LANGUAGES_INCUBATING
|
||||
base_score = get_language_score("de")
|
||||
opt['score'] = round(get_language_score(value) / base_score * 100)
|
||||
return opt
|
||||
|
||||
|
||||
|
||||
@@ -32,7 +32,6 @@ from django_scopes.forms import (
|
||||
|
||||
from pretix.base.channels import get_all_sales_channels
|
||||
from pretix.base.forms.widgets import SplitDateTimePickerWidget
|
||||
from pretix.base.models import Gate
|
||||
from pretix.base.models.checkin import Checkin, CheckinList
|
||||
from pretix.control.forms import ItemMultipleChoiceField
|
||||
from pretix.control.forms.widgets import Select2
|
||||
@@ -202,26 +201,3 @@ class CheckinListSimulatorForm(forms.Form):
|
||||
initial=True,
|
||||
required=False,
|
||||
)
|
||||
gate = SafeModelChoiceField(
|
||||
label=_('Gate'),
|
||||
empty_label=_('All gates'),
|
||||
queryset=Gate.objects.none(),
|
||||
required=False
|
||||
)
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.event = kwargs.pop('event')
|
||||
super().__init__(*args, **kwargs)
|
||||
|
||||
self.fields['gate'].queryset = self.event.organizer.gates.all()
|
||||
self.fields['gate'].widget = Select2(
|
||||
attrs={
|
||||
'data-model-select2': 'generic',
|
||||
'data-select2-url': reverse('control:organizer.gates.select2', kwargs={
|
||||
'organizer': self.event.organizer.slug,
|
||||
}),
|
||||
'data-placeholder': _('All gates'),
|
||||
}
|
||||
)
|
||||
self.fields['gate'].widget.choices = self.fields['gate'].choices
|
||||
self.fields['gate'].label = _('Gate')
|
||||
|
||||
@@ -50,16 +50,11 @@ class DiscountForm(I18nModelForm):
|
||||
'condition_ignore_voucher_discounted',
|
||||
'benefit_discount_matching_percent',
|
||||
'benefit_only_apply_to_cheapest_n_matches',
|
||||
'benefit_same_products',
|
||||
'benefit_limit_products',
|
||||
'benefit_apply_to_addons',
|
||||
'benefit_ignore_voucher_discounted',
|
||||
]
|
||||
field_classes = {
|
||||
'available_from': SplitDateTimeField,
|
||||
'available_until': SplitDateTimeField,
|
||||
'condition_limit_products': ItemMultipleChoiceField,
|
||||
'benefit_limit_products': ItemMultipleChoiceField,
|
||||
}
|
||||
widgets = {
|
||||
'subevent_mode': forms.RadioSelect,
|
||||
@@ -69,14 +64,11 @@ class DiscountForm(I18nModelForm):
|
||||
'data-inverse-dependency': '<[name$=all_products]',
|
||||
'class': 'scrolling-multiple-choice',
|
||||
}),
|
||||
'benefit_limit_products': forms.CheckboxSelectMultiple(attrs={
|
||||
'class': 'scrolling-multiple-choice',
|
||||
}),
|
||||
'benefit_only_apply_to_cheapest_n_matches': forms.NumberInput(
|
||||
attrs={
|
||||
'data-display-dependency': '#id_condition_min_count',
|
||||
}
|
||||
),
|
||||
)
|
||||
}
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
@@ -93,7 +85,6 @@ class DiscountForm(I18nModelForm):
|
||||
widget=forms.CheckboxSelectMultiple,
|
||||
)
|
||||
self.fields['condition_limit_products'].queryset = self.event.items.all()
|
||||
self.fields['benefit_limit_products'].queryset = self.event.items.all()
|
||||
self.fields['condition_min_count'].required = False
|
||||
self.fields['condition_min_count'].widget.is_required = False
|
||||
self.fields['condition_min_value'].required = False
|
||||
|
||||
@@ -38,7 +38,6 @@ from decimal import Decimal
|
||||
from urllib.parse import urlencode, urlparse
|
||||
from zoneinfo import ZoneInfo
|
||||
|
||||
import pycountry
|
||||
from django import forms
|
||||
from django.conf import settings
|
||||
from django.core.exceptions import NON_FIELD_ERRORS, ValidationError
|
||||
@@ -49,7 +48,7 @@ from django.forms import (
|
||||
)
|
||||
from django.urls import reverse
|
||||
from django.utils.functional import cached_property
|
||||
from django.utils.html import escape, format_html
|
||||
from django.utils.html import escape
|
||||
from django.utils.safestring import mark_safe
|
||||
from django.utils.timezone import get_current_timezone_name
|
||||
from django.utils.translation import gettext, gettext_lazy as _, pgettext_lazy
|
||||
@@ -62,13 +61,11 @@ from pytz import common_timezones
|
||||
from pretix.base.channels import get_all_sales_channels
|
||||
from pretix.base.email import get_available_placeholders
|
||||
from pretix.base.forms import I18nModelForm, PlaceholderValidator, SettingsForm
|
||||
from pretix.base.forms.widgets import format_placeholders_help_text
|
||||
from pretix.base.models import Event, Organizer, TaxRule, Team
|
||||
from pretix.base.models.event import EventFooterLink, EventMetaValue, SubEvent
|
||||
from pretix.base.reldate import RelativeDateField, RelativeDateTimeField
|
||||
from pretix.base.settings import (
|
||||
COUNTRIES_WITH_STATE_IN_ADDRESS, DEFAULTS, PERSON_NAME_SCHEMES,
|
||||
PERSON_NAME_TITLE_GROUPS, validate_event_settings,
|
||||
PERSON_NAME_SCHEMES, PERSON_NAME_TITLE_GROUPS, validate_event_settings,
|
||||
)
|
||||
from pretix.base.validators import multimail_validate
|
||||
from pretix.control.forms import (
|
||||
@@ -752,7 +749,6 @@ class PaymentSettingsForm(EventSettingsValidationMixin, SettingsForm):
|
||||
'payment_term_minutes',
|
||||
'payment_term_last',
|
||||
'payment_term_expire_automatically',
|
||||
'payment_term_expire_delay_days',
|
||||
'payment_term_accept_late',
|
||||
'payment_pending_hidden',
|
||||
'payment_explanation',
|
||||
@@ -901,27 +897,6 @@ class InvoiceSettingsForm(EventSettingsValidationMixin, SettingsForm):
|
||||
)
|
||||
self.fields['invoice_numbers_counter_length'].validators.append(MaxValueValidator(15))
|
||||
|
||||
pps = [str(pp.verbose_name) for pp in event.get_payment_providers().values() if pp.requires_invoice_immediately]
|
||||
if pps:
|
||||
generate_paid_help_text = _('An invoice will be issued before payment if the customer selects one of the following payment methods: {list}').format(
|
||||
list=', '.join(pps)
|
||||
)
|
||||
else:
|
||||
generate_paid_help_text = _('None of the currently configured payment methods will cause an invoice to be issued before payment.')
|
||||
|
||||
generate_choices = list(DEFAULTS['invoice_generate']['form_kwargs']['choices'])
|
||||
idx = [i for i, t in enumerate(generate_choices) if t[0] == 'paid'][0]
|
||||
generate_choices[idx] = (
|
||||
'paid',
|
||||
format_html(
|
||||
'{} <span class="label label-success">{}</span><br><span class="text-muted">{}</span>',
|
||||
generate_choices[idx][1],
|
||||
_('Recommended'),
|
||||
generate_paid_help_text
|
||||
)
|
||||
)
|
||||
self.fields['invoice_generate'].choices = generate_choices
|
||||
|
||||
|
||||
def contains_web_channel_validate(val):
|
||||
if "web" not in val:
|
||||
@@ -1140,16 +1115,6 @@ class MailSettingsForm(SettingsForm):
|
||||
help_text=_("This email only applies to payment methods that can receive incomplete payments, "
|
||||
"such as bank transfer."),
|
||||
)
|
||||
mail_subject_order_payment_failed = I18nFormField(
|
||||
label=_("Subject"),
|
||||
required=False,
|
||||
widget=I18nTextInput,
|
||||
)
|
||||
mail_text_order_payment_failed = I18nFormField(
|
||||
label=_("Text"),
|
||||
required=False,
|
||||
widget=I18nTextarea,
|
||||
)
|
||||
mail_subject_waiting_list = I18nFormField(
|
||||
label=_("Subject"),
|
||||
required=False,
|
||||
@@ -1297,12 +1262,12 @@ class MailSettingsForm(SettingsForm):
|
||||
'mail_subject_order_placed_require_approval': ['event', 'order'],
|
||||
'mail_text_order_approved': ['event', 'order'],
|
||||
'mail_subject_order_approved': ['event', 'order'],
|
||||
'mail_text_order_approved_attendee': ['event', 'order', 'position'],
|
||||
'mail_subject_order_approved_attendee': ['event', 'order', 'position'],
|
||||
'mail_text_order_approved_attendee': ['event', 'order'],
|
||||
'mail_subject_order_approved_attendee': ['event', 'order'],
|
||||
'mail_text_order_approved_free': ['event', 'order'],
|
||||
'mail_subject_order_approved_free': ['event', 'order'],
|
||||
'mail_text_order_approved_free_attendee': ['event', 'order', 'position'],
|
||||
'mail_subject_order_approved_free_attendee': ['event', 'order', 'position'],
|
||||
'mail_text_order_approved_free_attendee': ['event', 'order'],
|
||||
'mail_subject_order_approved_free_attendee': ['event', 'order'],
|
||||
'mail_text_order_denied': ['event', 'order', 'comment'],
|
||||
'mail_subject_order_denied': ['event', 'order', 'comment'],
|
||||
'mail_text_order_paid': ['event', 'order', 'payment_info'],
|
||||
@@ -1323,8 +1288,6 @@ class MailSettingsForm(SettingsForm):
|
||||
'mail_subject_order_pending_warning': ['event', 'order'],
|
||||
'mail_text_order_incomplete_payment': ['event', 'order', 'pending_sum'],
|
||||
'mail_subject_order_incomplete_payment': ['event', 'order'],
|
||||
'mail_text_order_payment_failed': ['event', 'order'],
|
||||
'mail_subject_order_payment_failed': ['event', 'order'],
|
||||
'mail_text_order_custom_mail': ['event', 'order'],
|
||||
'mail_text_download_reminder': ['event', 'order'],
|
||||
'mail_subject_download_reminder': ['event', 'order'],
|
||||
@@ -1332,7 +1295,7 @@ class MailSettingsForm(SettingsForm):
|
||||
'mail_subject_download_reminder_attendee': ['event', 'order', 'position'],
|
||||
'mail_text_resend_link': ['event', 'order'],
|
||||
'mail_subject_resend_link': ['event', 'order'],
|
||||
'mail_subject_resend_link_attendee': ['event', 'order', 'position'],
|
||||
'mail_subject_resend_link_attendee': ['event', 'order'],
|
||||
'mail_text_waiting_list': ['event', 'waiting_list_entry', 'waiting_list_voucher'],
|
||||
'mail_subject_waiting_list': ['event', 'waiting_list_entry', 'waiting_list_voucher'],
|
||||
'mail_text_resend_all_links': ['event', 'orders'],
|
||||
@@ -1341,14 +1304,19 @@ class MailSettingsForm(SettingsForm):
|
||||
}
|
||||
|
||||
def _set_field_placeholders(self, fn, base_parameters):
|
||||
placeholders = get_available_placeholders(self.event, base_parameters)
|
||||
ht = format_placeholders_help_text(placeholders, self.event)
|
||||
phs = [
|
||||
'{%s}' % p
|
||||
for p in sorted(get_available_placeholders(self.event, base_parameters).keys())
|
||||
]
|
||||
ht = _('Available placeholders: {list}').format(
|
||||
list=', '.join(phs)
|
||||
)
|
||||
if self.fields[fn].help_text:
|
||||
self.fields[fn].help_text += ' ' + str(ht)
|
||||
else:
|
||||
self.fields[fn].help_text = ht
|
||||
self.fields[fn].validators.append(
|
||||
PlaceholderValidator(['{%s}' % p for p in placeholders.keys()])
|
||||
PlaceholderValidator(phs)
|
||||
)
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
@@ -1447,20 +1415,9 @@ class CountriesAndEU(CachedCountries):
|
||||
cache_subkey = 'with_any_or_eu'
|
||||
|
||||
|
||||
class CountriesAndEUAndStates(CountriesAndEU):
|
||||
def __iter__(self):
|
||||
for country_code, country_name in super().__iter__():
|
||||
yield country_code, country_name
|
||||
if country_code in COUNTRIES_WITH_STATE_IN_ADDRESS:
|
||||
types, form = COUNTRIES_WITH_STATE_IN_ADDRESS[country_code]
|
||||
yield from sorted(((state.code, country_name + " - " + state.name)
|
||||
for state in pycountry.subdivisions.get(country_code=country_code)
|
||||
if state.type in types), key=lambda s: s[1])
|
||||
|
||||
|
||||
class TaxRuleLineForm(I18nForm):
|
||||
country = LazyTypedChoiceField(
|
||||
choices=CountriesAndEUAndStates(),
|
||||
choices=CountriesAndEU(),
|
||||
required=False
|
||||
)
|
||||
address_type = forms.ChoiceField(
|
||||
|
||||
@@ -1732,8 +1732,8 @@ class CheckinListAttendeeFilterForm(FilterForm):
|
||||
'-timestamp': (OrderBy(F('last_entry'), nulls_last=True, descending=True), '-order__code'),
|
||||
'item': ('item__name', 'variation__value', 'order__code'),
|
||||
'-item': ('-item__name', '-variation__value', '-order__code'),
|
||||
'seat': ('seat__sorting_rank', 'seat__seat_guid'),
|
||||
'-seat': ('-seat__sorting_rank', '-seat__seat_guid'),
|
||||
'seat': ('seat__sorting_rank', 'seat__guid'),
|
||||
'-seat': ('-seat__sorting_rank', '-seat__guid'),
|
||||
'date': ('subevent__date_from', 'subevent__id', 'order__code'),
|
||||
'-date': ('-subevent__date_from', 'subevent__id', '-order__code'),
|
||||
'name': {'_order': F('display_name').asc(nulls_first=True),
|
||||
@@ -1940,7 +1940,7 @@ class VoucherFilterForm(FilterForm):
|
||||
'item__category__position',
|
||||
'item__category',
|
||||
'item__position',
|
||||
'variation__position',
|
||||
'item__variation__position',
|
||||
'quota__name',
|
||||
),
|
||||
'subevent': 'subevent__date_from',
|
||||
@@ -1950,7 +1950,7 @@ class VoucherFilterForm(FilterForm):
|
||||
'-item__category__position',
|
||||
'-item__category',
|
||||
'-item__position',
|
||||
'-variation__position',
|
||||
'-item__variation__position',
|
||||
'-quota__name',
|
||||
)
|
||||
}
|
||||
@@ -2397,61 +2397,6 @@ class CheckinFilterForm(FilterForm):
|
||||
return qs
|
||||
|
||||
|
||||
class CheckinListFilterForm(FilterForm):
|
||||
orders = {
|
||||
"name": ("name", "subevent__date_from", "pk"),
|
||||
"-name": ("-name", "-subevent__date_from", "-pk"),
|
||||
"subevent": ("subevent__date_from", "name", "pk"),
|
||||
"-subevent": ("-subevent__date_from", "-name", "-pk"),
|
||||
}
|
||||
subevent = forms.ModelChoiceField(
|
||||
label=pgettext_lazy('subevent', 'Date'),
|
||||
queryset=SubEvent.objects.none(),
|
||||
required=False,
|
||||
empty_label=pgettext_lazy('subevent', 'All dates')
|
||||
)
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.event = kwargs.pop("event")
|
||||
super().__init__(*args, **kwargs)
|
||||
if self.event.has_subevents:
|
||||
self.fields['subevent'].queryset = self.event.subevents.all()
|
||||
self.fields['subevent'].widget = Select2(
|
||||
attrs={
|
||||
'data-model-select2': 'event',
|
||||
'data-select2-url': reverse('control:event.subevents.select2', kwargs={
|
||||
'event': self.event.slug,
|
||||
'organizer': self.event.organizer.slug,
|
||||
}),
|
||||
'data-placeholder': pgettext_lazy('subevent', 'All dates')
|
||||
}
|
||||
)
|
||||
self.fields['subevent'].widget.choices = self.fields['subevent'].choices
|
||||
elif 'subevent':
|
||||
del self.fields['subevent']
|
||||
|
||||
def filter_qs(self, qs):
|
||||
fdata = self.cleaned_data
|
||||
|
||||
if fdata.get('subevent'):
|
||||
qs = qs.filter(subevent=fdata.get('subevent'))
|
||||
|
||||
if fdata.get("ordering"):
|
||||
ob = self.orders[fdata.get('ordering')]
|
||||
if isinstance(ob, dict):
|
||||
ob = dict(ob)
|
||||
o = ob.pop('_order')
|
||||
qs = qs.annotate(**ob).order_by(*get_deterministic_ordering(OrderPosition, [o]))
|
||||
elif isinstance(ob, (list, tuple)):
|
||||
qs = qs.order_by(*get_deterministic_ordering(OrderPosition, ob))
|
||||
else:
|
||||
qs = qs.order_by(*get_deterministic_ordering(OrderPosition, [ob]))
|
||||
else:
|
||||
qs = qs.order_by("subevent__date_from", "name", "pk")
|
||||
|
||||
return qs.distinct()
|
||||
|
||||
|
||||
class DeviceFilterForm(FilterForm):
|
||||
orders = {
|
||||
'name': Upper('name'),
|
||||
|
||||
@@ -38,7 +38,6 @@ from django import forms
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from i18nfield.forms import I18nFormField, I18nTextarea, I18nTextInput
|
||||
|
||||
from pretix import settings
|
||||
from pretix.base.forms import SecretKeySettingsField, SettingsForm
|
||||
from pretix.base.settings import GlobalSettingsObject
|
||||
from pretix.base.signals import register_global_settings
|
||||
@@ -87,22 +86,13 @@ class GlobalSettingsForm(SettingsForm):
|
||||
('leaflet_tiles', forms.CharField(
|
||||
required=False,
|
||||
label=_("Leaflet tiles URL pattern"),
|
||||
help_text=_("e.g. {sample}").format(sample="https://tile.openstreetmap.org/{z}/{x}/{y}.png")
|
||||
help_text=_("e.g. {sample}").format(sample="https://a.tile.openstreetmap.org/{z}/{x}/{y}.png")
|
||||
)),
|
||||
('leaflet_tiles_attribution', forms.CharField(
|
||||
required=False,
|
||||
label=_("Leaflet tiles attribution"),
|
||||
help_text=_("e.g. {sample}").format(
|
||||
sample='© <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
|
||||
)
|
||||
help_text=_("e.g. {sample}").format(sample='© <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors')
|
||||
)),
|
||||
('apple_domain_association', forms.CharField(
|
||||
required=False,
|
||||
label=_("ApplePay MerchantID Domain Association"),
|
||||
help_text=_("Will be served at {domain}/.well-known/apple-developer-merchantid-domain-association").format(
|
||||
domain=settings.SITE_URL
|
||||
)
|
||||
))
|
||||
])
|
||||
responses = register_global_settings.send(self)
|
||||
for r, response in sorted(responses, key=lambda r: str(r[0])):
|
||||
|
||||
@@ -461,6 +461,11 @@ class ItemCreateForm(I18nModelForm):
|
||||
)
|
||||
|
||||
if self.cleaned_data.get('copy_from'):
|
||||
for mv in self.cleaned_data['copy_from'].meta_values.all():
|
||||
mv.pk = None
|
||||
mv.item = instance
|
||||
mv.save(force_insert=True)
|
||||
|
||||
for question in self.cleaned_data['copy_from'].questions.all():
|
||||
question.items.add(instance)
|
||||
question.log_action('pretix.event.question.changed', user=self.user, data={
|
||||
|
||||
@@ -54,7 +54,7 @@ from pretix.base.email import get_available_placeholders
|
||||
from pretix.base.forms import I18nModelForm, PlaceholderValidator
|
||||
from pretix.base.forms.questions import WrappedPhoneNumberPrefixWidget
|
||||
from pretix.base.forms.widgets import (
|
||||
DatePickerWidget, SplitDateTimePickerWidget, format_placeholders_help_text,
|
||||
DatePickerWidget, SplitDateTimePickerWidget,
|
||||
)
|
||||
from pretix.base.models import (
|
||||
Invoice, InvoiceAddress, ItemAddOn, Order, OrderFee, OrderPosition,
|
||||
@@ -159,7 +159,7 @@ class ReactivateOrderForm(ForceQuotaConfirmationForm):
|
||||
pass
|
||||
|
||||
|
||||
class CancelForm(forms.Form):
|
||||
class CancelForm(ForceQuotaConfirmationForm):
|
||||
send_email = forms.BooleanField(
|
||||
required=False,
|
||||
label=_('Notify customer by email'),
|
||||
@@ -188,7 +188,6 @@ class CancelForm(forms.Form):
|
||||
)
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.instance = kwargs.pop("instance")
|
||||
super().__init__(*args, **kwargs)
|
||||
change_decimal_field(self.fields['cancellation_fee'], self.instance.event.currency)
|
||||
self.fields['cancellation_fee'].widget.attrs['placeholder'] = floatformat(
|
||||
@@ -206,20 +205,6 @@ class CancelForm(forms.Form):
|
||||
return val
|
||||
|
||||
|
||||
class DenyForm(forms.Form):
|
||||
send_email = forms.BooleanField(
|
||||
required=False,
|
||||
label=_('Notify customer by email'),
|
||||
initial=True
|
||||
)
|
||||
comment = forms.CharField(
|
||||
label=_('Comment (will be sent to the user)'),
|
||||
help_text=_('Will be included in the notification email when the respective placeholder is present in the '
|
||||
'configured email text.'),
|
||||
required=False,
|
||||
)
|
||||
|
||||
|
||||
class MarkPaidForm(ConfirmPaymentForm):
|
||||
send_email = forms.BooleanField(
|
||||
required=False,
|
||||
@@ -677,14 +662,19 @@ class OrderMailForm(forms.Form):
|
||||
)
|
||||
|
||||
def _set_field_placeholders(self, fn, base_parameters):
|
||||
placeholders = get_available_placeholders(self.order.event, base_parameters)
|
||||
ht = format_placeholders_help_text(placeholders, self.order.event)
|
||||
phs = [
|
||||
'{%s}' % p
|
||||
for p in sorted(get_available_placeholders(self.order.event, base_parameters).keys())
|
||||
]
|
||||
ht = _('Available placeholders: {list}').format(
|
||||
list=', '.join(phs)
|
||||
)
|
||||
if self.fields[fn].help_text:
|
||||
self.fields[fn].help_text += ' ' + str(ht)
|
||||
else:
|
||||
self.fields[fn].help_text = ht
|
||||
self.fields[fn].validators.append(
|
||||
PlaceholderValidator(['{%s}' % p for p in placeholders.keys()])
|
||||
PlaceholderValidator(phs)
|
||||
)
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
@@ -867,14 +857,19 @@ class EventCancelForm(forms.Form):
|
||||
send_waitinglist_message = forms.CharField()
|
||||
|
||||
def _set_field_placeholders(self, fn, base_parameters):
|
||||
placeholders = get_available_placeholders(self.event, base_parameters)
|
||||
ht = format_placeholders_help_text(placeholders, self.event)
|
||||
phs = [
|
||||
'{%s}' % p
|
||||
for p in sorted(get_available_placeholders(self.event, base_parameters).keys())
|
||||
]
|
||||
ht = _('Available placeholders: {list}').format(
|
||||
list=', '.join(phs)
|
||||
)
|
||||
if self.fields[fn].help_text:
|
||||
self.fields[fn].help_text += ' ' + str(ht)
|
||||
else:
|
||||
self.fields[fn].help_text = ht
|
||||
self.fields[fn].validators.append(
|
||||
PlaceholderValidator(['{%s}' % p for p in placeholders.keys()])
|
||||
PlaceholderValidator(phs)
|
||||
)
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
|
||||
@@ -63,9 +63,7 @@ from pretix.base.forms.questions import (
|
||||
NamePartsFormField, WrappedPhoneNumberPrefixWidget, get_country_by_locale,
|
||||
get_phone_prefix,
|
||||
)
|
||||
from pretix.base.forms.widgets import (
|
||||
SplitDateTimePickerWidget, format_placeholders_help_text,
|
||||
)
|
||||
from pretix.base.forms.widgets import SplitDateTimePickerWidget
|
||||
from pretix.base.models import (
|
||||
Customer, Device, EventMetaProperty, Gate, GiftCard, GiftCardAcceptance,
|
||||
Membership, MembershipType, OrderPosition, Organizer, ReusableMedium, Team,
|
||||
@@ -414,10 +412,6 @@ class OrganizerSettingsForm(SettingsForm):
|
||||
'reusable_media_type_nfc_uid',
|
||||
'reusable_media_type_nfc_uid_autocreate_giftcard',
|
||||
'reusable_media_type_nfc_uid_autocreate_giftcard_currency',
|
||||
'reusable_media_type_nfc_mf0aes',
|
||||
'reusable_media_type_nfc_mf0aes_autocreate_giftcard',
|
||||
'reusable_media_type_nfc_mf0aes_autocreate_giftcard_currency',
|
||||
'reusable_media_type_nfc_mf0aes_random_uid',
|
||||
]
|
||||
|
||||
organizer_logo_image = ExtFileField(
|
||||
@@ -570,14 +564,19 @@ class MailSettingsForm(SettingsForm):
|
||||
return placeholders
|
||||
|
||||
def _set_field_placeholders(self, fn, base_parameters):
|
||||
placeholders = self._get_sample_context(base_parameters)
|
||||
ht = format_placeholders_help_text(placeholders)
|
||||
phs = [
|
||||
'{%s}' % p
|
||||
for p in sorted(self._get_sample_context(base_parameters).keys())
|
||||
]
|
||||
ht = _('Available placeholders: {list}').format(
|
||||
list=', '.join(phs)
|
||||
)
|
||||
if self.fields[fn].help_text:
|
||||
self.fields[fn].help_text += ' ' + str(ht)
|
||||
else:
|
||||
self.fields[fn].help_text = ht
|
||||
self.fields[fn].validators.append(
|
||||
PlaceholderValidator(['{%s}' % p for p in placeholders.keys()])
|
||||
PlaceholderValidator(phs)
|
||||
)
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
|
||||
@@ -46,7 +46,6 @@ from django_scopes.forms import SafeModelChoiceField
|
||||
|
||||
from pretix.base.email import get_available_placeholders
|
||||
from pretix.base.forms import I18nModelForm, PlaceholderValidator
|
||||
from pretix.base.forms.widgets import format_placeholders_help_text
|
||||
from pretix.base.models import Item, Voucher
|
||||
from pretix.control.forms import SplitDateTimeField, SplitDateTimePickerWidget
|
||||
from pretix.control.forms.widgets import Select2, Select2ItemVarQuota
|
||||
@@ -290,15 +289,19 @@ class VoucherBulkForm(VoucherForm):
|
||||
Recipient = namedtuple('Recipient', 'email number name tag')
|
||||
|
||||
def _set_field_placeholders(self, fn, base_parameters):
|
||||
placeholders = get_available_placeholders(self.instance.event, base_parameters)
|
||||
ht = format_placeholders_help_text(placeholders, self.instance.event)
|
||||
|
||||
phs = [
|
||||
'{%s}' % p
|
||||
for p in sorted(get_available_placeholders(self.instance.event, base_parameters).keys())
|
||||
]
|
||||
ht = _('Available placeholders: {list}').format(
|
||||
list=', '.join(phs)
|
||||
)
|
||||
if self.fields[fn].help_text:
|
||||
self.fields[fn].help_text += ' ' + str(ht)
|
||||
else:
|
||||
self.fields[fn].help_text = ht
|
||||
self.fields[fn].validators.append(
|
||||
PlaceholderValidator(['{%s}' % p for p in placeholders.keys()])
|
||||
PlaceholderValidator(phs)
|
||||
)
|
||||
|
||||
class Meta:
|
||||
@@ -337,9 +340,6 @@ class VoucherBulkForm(VoucherForm):
|
||||
|
||||
def clean_send_recipients(self):
|
||||
raw = self.cleaned_data['send_recipients']
|
||||
if self.cleaned_data.get('send', None) is False:
|
||||
# No need to validate addresses if the section was turned off
|
||||
return []
|
||||
if not raw:
|
||||
return []
|
||||
r = raw.split('\n')
|
||||
|
||||
@@ -341,7 +341,6 @@ def pretixcontrol_logentry_display(sender: Event, logentry: LogEntry, **kwargs):
|
||||
'pretix.giftcards.acceptance.added': _('Gift card acceptance for another organizer has been added.'),
|
||||
'pretix.giftcards.acceptance.removed': _('Gift card acceptance for another organizer has been removed.'),
|
||||
'pretix.giftcards.acceptance.acceptor.invited': _('A new gift card acceptor has been invited.'),
|
||||
'pretix.giftcards.acceptance.acceptor.removed': _('A gift card acceptor has been removed.'),
|
||||
'pretix.giftcards.acceptance.issuer.removed': _('A gift card issuer has been removed or declined.'),
|
||||
'pretix.giftcards.acceptance.issuer.accepted': _('A new gift card issuer has been accepted.'),
|
||||
'pretix.webhook.created': _('The webhook has been created.'),
|
||||
@@ -369,7 +368,6 @@ def pretixcontrol_logentry_display(sender: Event, logentry: LogEntry, **kwargs):
|
||||
'pretix.reusable_medium.created.auto': _('The reusable medium has been created automatically.'),
|
||||
'pretix.reusable_medium.changed': _('The reusable medium has been changed.'),
|
||||
'pretix.reusable_medium.linked_orderposition.changed': _('The medium has been connected to a new ticket.'),
|
||||
'pretix.reusable_medium.linked_giftcard.changed': _('The medium has been connected to a new gift card.'),
|
||||
'pretix.email.error': _('Sending of an email has failed.'),
|
||||
'pretix.event.comment': _('The event\'s internal comment has been updated.'),
|
||||
'pretix.event.canceled': _('The event has been canceled.'),
|
||||
@@ -434,7 +432,6 @@ def pretixcontrol_logentry_display(sender: Event, logentry: LogEntry, **kwargs):
|
||||
'the order has been received and requires '
|
||||
'approval.'),
|
||||
'pretix.event.order.email.resend': _('An email with a link to the order detail page has been resent to the user.'),
|
||||
'pretix.event.order.email.payment_failed': _('An email has been sent to notify the user that the payment failed.'),
|
||||
'pretix.event.order.payment.confirmed': _('Payment {local_id} has been confirmed.'),
|
||||
'pretix.event.order.payment.canceled': _('Payment {local_id} has been canceled.'),
|
||||
'pretix.event.order.payment.canceled.failed': _('Canceling payment {local_id} has failed.'),
|
||||
|
||||
@@ -304,10 +304,6 @@ an instance of a form class that you bind yourself when appropriate. Your form w
|
||||
as part of the standard validation and rendering cycle and rendered using default bootstrap
|
||||
styles. It is advisable to set a prefix for your form to avoid clashes with other plugins.
|
||||
|
||||
Your forms may also have two special properties: ``template`` with a template that will be
|
||||
included to render the form, and ``title``, which will be used as a headline. Your template
|
||||
will be passed a ``form`` variable with your form.
|
||||
|
||||
As with all plugin signals, the ``sender`` keyword argument will contain the event.
|
||||
"""
|
||||
|
||||
|
||||
Some files were not shown because too many files have changed in this diff.