diff --git a/.forgejo/workflows/deploy_staging.yml b/.forgejo/workflows/deploy_staging.yml index 3b44d24..4c5fcd0 100644 --- a/.forgejo/workflows/deploy_staging.yml +++ b/.forgejo/workflows/deploy_staging.yml @@ -35,7 +35,7 @@ jobs: - name: Populate relevant files run: | - mkdir ~/.ssh + mkdir -p ~/.ssh echo "${{ secrets.C3LF_SSH_TESTING }}" > ~/.ssh/id_ed25519 chmod 0600 ~/.ssh/id_ed25519 ls -lah ~/.ssh @@ -43,7 +43,7 @@ jobs: eval $(ssh-agent -s) ssh-add ~/.ssh/id_ed25519 echo "andromeda.lab.or.it ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDXPoO0PE+B9PYwbGaLo98zhbmjAkp6eBtVeZe43v/+T" >> ~/.ssh/known_hosts - mkdir /etc/ansible + mkdir -p /etc/ansible echo "${{ secrets.C3LF_INVENTORY_TESTING }}" > /etc/ansible/hosts - name: Check ansible version diff --git a/README.md b/README.md new file mode 100644 index 0000000..3581cac --- /dev/null +++ b/README.md @@ -0,0 +1,158 @@ +# C3LF System3 + +The third try to automate lost&found organization for chaos events. Not a complete rewrite, but instead building on top +of the web frontend of version 2. Everything else is new but still API compatible. Now with more monorepo. + +## Architecture + +C3LF System3 integrates a Django-Rest-Framework + WebSocket backend, a Vue.js frontend SPA and a minimal LMTP mail server +integrated with the Django backend. It is additionally deployed with a Postfix mail server as a proxy in front of the +LMTP socket, a MariaDB database, a Redis cache and an Nginx reverse proxy that serves the static SPA frontend, proxies +the API requests to the backend and serves the media files in cooperation with the Django backend using the +`X-Accel-Redirect` header. + +The production deployment is automated using Ansible, and there are some Docker Compose configurations for development. + +## Project Structure + +- `core/` Contains the Django backend with database models, API endpoints, migrations, API tests, and mail server + functionalities. - `web/` Contains the Vue.js frontend application. - `deploy/` Contains deployment configurations and Docker scripts for various development modes. + +For more information, see the README.md files in the respective directories. + +## Development Modes + +There are currently 4 development modes for this project: + +- Frontend-Only +- Backend-API-Only +- Full-Stack-Lite 'dev' (docker) +- **[WIP]** Full-Stack 'testing' (docker) + +*Choose the one that is most suited to the feature you want to work on or is easiest for you to set up ;)* + +For all modes it is assumed that you have `git` installed, have cloned the repository and are in the root directory of +the project. Use `git clone https://git.hannover.ccc.de/c3lf/c3lf-system-3.git` to get the official upstream repository. +The required packages for each mode are listed separately, including the specific package names for Debian 12. + +### Frontend-Only + +This mode is for developing the frontend only. It uses the vue-cli-service (webpack) to serve the frontend and watches +for changes in the source code to provide hot reloading. The API requests are proxied to the staging backend. + +#### Requirements + +* Node.js (~20.19.0) (`nodejs`) +* npm (~9.2.0) (`npm`) + +*Note: The versions are not strict, but these are tested. Other versions might work as well.* + +#### Steps + +```bash +cd web +npm install +npm run serve +``` + +Now you can access the frontend at `localhost:8080` and start editing the code in the `web` directory. +For more information, see the README.md file in the `web` directory.
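The `X-Accel-Redirect` handoff mentioned in the Architecture section works roughly as sketched below. This is a minimal illustration under assumed names — the view, the URL path and the internal `/protected/` Nginx location are illustrative, not the project's actual code.

```python
# Hypothetical sketch of the X-Accel-Redirect media handoff described in the
# Architecture section: Django performs the permission check, then asks Nginx
# to stream the file from an internal-only location.
from django.http import HttpResponse


def serve_media(request, path):
    # ... authorization checks on `request` would go here ...
    response = HttpResponse(content_type="")  # let Nginx set the content type
    # Nginx maps this internal location to the media root, e.g.
    #   location /protected/ { internal; alias /srv/media/; }
    response["X-Accel-Redirect"] = f"/protected/{path}"
    return response
```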
+ +### Backend-API-Only + +This mode is for developing the backend API only. It also specifically excludes most WebSocket and mail server +functionalities. Use this mode to focus on the backend API and database models. + +#### Requirements + +* Python (~3.11) (`python3`) +* pip (`python3-pip`) +* virtualenv (`python3-venv`) + +*Note: The versions are not strict, but these are tested. Other versions might work as well.* + +#### Steps + +``` +python -m venv venv +source venv/bin/activate +pip install -r core/requirements.dev.txt +cd core +python manage.py test +``` + +The tests should pass from the start, and you can then begin the TDD workflow by adding new failing tests. +For more information about the backend and TDD, see the README.md file in the `core` directory. + +### Full-Stack-Lite 'dev' (docker) + +This mode is for developing both the frontend and the backend at the same time in a containerized environment. It +uses the `docker-compose` command to build and run the application in a container. It specifically excludes all mail +server and most WebSocket functionalities. + +#### Requirements + +* Docker (`docker.io`) +* Docker Compose (`docker-compose`) + +*Note: Depending on your system, the `docker compose` command might be included in the general `docker` or `docker-ce` +package, or you might want to use podman instead.* + +#### Steps + +```bash +docker-compose -f deploy/dev/docker-compose.yml up --build +``` + +The page should be available at [localhost:8080](http://localhost:8080). +This mode provides a minimal set of test data, including a user `testuser` with password `testuser`. The test dataset is +defined in deploy/testdata.py and can be extended there. + +You can now edit code in `/web` and `/core`, and changes will be applied to the running page as soon as the file is +saved. + +For details about each part, read `/web/README.md` and `/core/README.md` respectively. To execute commands in the +container context, use 'exec' like + +```bash +docker exec -it c3lf-sys3-dev-core-1 ./manage.py test +``` + +### Full-Stack 'testing' (docker) + +**WORK IN PROGRESS** + +*Will include Postfix, MariaDB, Redis, Nginx and the ability to test sending mail, receiving mail and WebSocket-based +realtime updates in the frontend. This is the last verification step before deploying to the staging system using Ansible.* + +## Online Instances + +These are deployed using `deploy/ansible/playbooks/deploy-c3lf-sys3.yml` and follow a specific git branch. + +### 'live' + +| URL | [c3lf.de](https://c3lf.de) | +|----------------|----------------------------| +| **Branch** | live | +| **Host** | polaris.lab.or.it | +| **Debug Mode** | off | + +This is the **'production' system** and should strictly follow the staging system after all changes have been validated. + +### 'staging' + +| URL | [staging.c3lf.de](https://staging.c3lf.de) | +|----------------|--------------------------------------------| +| **Branch** | testing | +| **Host** | andromeda.lab.or.it | +| **Debug Mode** | on | + +This system is automatically updated by [git.hannover.ccc.de](https://git.hannover.ccc.de/c3lf/c3lf-system-3/) whenever +a commit is pushed to the 'testing' branch and the backend tests pass. + +**WARNING: although this is the staging system, it is fully functional and contains a copy of the 'production' data, so +do not, for example, reply to tickets for testing purposes, as the system WILL SEND AN EMAIL to the person who originally +created it.
If you want to test something like that, first create your own test ticket by sending an email to +`@staging.c3lf.de`** \ No newline at end of file diff --git a/core/testdata.py b/core/.local/.forgit_fordocker similarity index 100% rename from core/testdata.py rename to core/.local/.forgit_fordocker diff --git a/core/README.md b/core/README.md new file mode 100644 index 0000000..f9780e0 --- /dev/null +++ b/core/README.md @@ -0,0 +1,68 @@ +# Core + +This directory contains the backend of the C3LF System3 project, which is built using Django and Django Rest Framework. + +## Modules + +- `authentication`: Handles user authentication and authorization. +- `files`: Manages file uploads and related operations. +- `inventory`: Handles inventory management, including events, containers and items. +- `mail`: Manages email-related functionalities, including sending and receiving emails. +- `notify_sessions`: Handles real-time notifications and WebSocket sessions. +- `tickets`: Manages the ticketing system for issue tracking. + +## Module Structure + +Most modules follow a similar structure, including the following components: + +- `<module>/models.py`: Contains the database models for the module. +- `<module>/serializers.py`: Contains the serializers for the module models. +- `<module>/api_<version>.py`: Contains the API views and endpoints for the module. +- `<module>/migrations/`: Contains database migration files. Needs to contain an `__init__.py` file to be recognized as + a Python package and for automatic migration creation to work. +- `<module>/tests/<version>/test_<topic>.py`: Contains the test cases for the module. + +## Development Setup + +Follow the instructions under 'Backend-API-Only' or 'Full-Stack-Lite' in the root level `README.md` to set up a +development environment. + +## Test-Driven Development (TDD) Workflow + +The project follows a TDD workflow to ensure code quality and reliability. Here is a step-by-step guide to the TDD +process: + +1. **Write a Test**: Start by writing a test case for the new feature or bug fix. Place the test case in the appropriate + module within the `<module>/tests/<version>/test_<topic>.py` file. + +2. **Run the Test**: Execute the test to ensure it fails, confirming that the feature is not yet implemented or the bug + exists. + ```bash + python manage.py test + ``` + +3. **Write the Code**: Implement the code required to pass the test. Write the code in the appropriate module within the + project. + +4. **Run the Test Again**: Execute the test again to ensure it passes. + ```bash + python manage.py test + ``` + +5. **Refactor**: Refactor the code to improve its structure and readability while ensuring that all tests still pass. + +6. **Repeat**: Repeat the process for each new feature or bug fix. + +## Measuring Test Coverage + +The project uses the `coverage` package to measure test coverage. To generate a coverage report, run the following +command: + +```bash +coverage run --source='.' manage.py test +coverage report +``` + +## Additional Information + +For more detailed information on the project structure and development modes, refer to the root level `README.md`.
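To make step 1 of the workflow concrete, a minimal sketch of what a first failing test might look like follows; the module path, URL and expected status code are placeholders rather than actual project endpoints.

```python
# <module>/tests/<version>/test_example.py -- hypothetical first test written
# before the implementation exists (TDD step 1). It fails until the endpoint
# and its permission check are implemented.
from django.test import TestCase, Client


class ExampleFeatureTestCase(TestCase):
    def test_new_endpoint_rejects_anonymous_users(self):
        client = Client()
        response = client.get('/api/2/example/')  # placeholder URL
        self.assertEqual(response.status_code, 403)
```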
\ No newline at end of file diff --git a/core/core/metrics.py b/core/core/metrics.py new file mode 100644 index 0000000..d973b0d --- /dev/null +++ b/core/core/metrics.py @@ -0,0 +1,40 @@ +from django.apps import apps +from prometheus_client.core import CounterMetricFamily, REGISTRY +from django.db.models import Case, Value, When, BooleanField, Count +from inventory.models import Item + + +class ItemCountCollector(object): + + def collect(self): + try: + counter = CounterMetricFamily("item_count", "Current number of items", labels=['event', 'returned_state']) + + yield counter + + if not apps.models_ready or not apps.apps_ready: + return + + queryset = ( + Item.all_objects + .annotate( + returned=Case( + When(returned_at__isnull=True, then=Value(False)), + default=Value(True), + output_field=BooleanField() + ) + ) + .values('event__slug', 'returned', 'event_id') + .annotate(amount=Count('id')) + .order_by('event__slug', 'returned') # Optional: order by slug and returned + ) + + for e in queryset: + counter.add_metric([e["event__slug"].lower(), str(e["returned"])], e["amount"]) + + yield counter + except: + pass + + +REGISTRY.register(ItemCountCollector()) diff --git a/core/core/urls.py b/core/core/urls.py index 1c5f158..2386891 100644 --- a/core/core/urls.py +++ b/core/core/urls.py @@ -19,6 +19,8 @@ from django.urls import path, include from .version import get_info +from .metrics import * + urlpatterns = [ path('djangoadmin/', admin.site.urls), path('api/2/', include('inventory.api_v2')), diff --git a/core/files/models.py b/core/files/models.py index 33a6265..a8eb775 100644 --- a/core/files/models.py +++ b/core/files/models.py @@ -1,6 +1,5 @@ from django.core.files.base import ContentFile from django.db import models, IntegrityError -from django_softdelete.models import SoftDeleteModel from inventory.models import Item @@ -10,7 +9,8 @@ def hash_upload(instance, filename): class FileManager(models.Manager): - def get_or_create(self, **kwargs): + + def __file_data_helper(self, **kwargs): if 'data' in kwargs and type(kwargs['data']) == str: import base64 from hashlib import sha256 @@ -31,6 +31,10 @@ class FileManager(models.Manager): pass else: raise ValueError('data must be a base64 encoded string or file and hash must be provided') + return kwargs + + def get_or_create(self, **kwargs): + kwargs = self.__file_data_helper(**kwargs) try: return self.get(hash=kwargs['hash']), False except self.model.DoesNotExist: @@ -39,26 +43,7 @@ class FileManager(models.Manager): return obj, True def create(self, **kwargs): - if 'data' in kwargs and type(kwargs['data']) == str: - import base64 - from hashlib import sha256 - raw = kwargs['data'] - if not raw.startswith('data:'): - raise ValueError('data must be a base64 encoded string or file and hash must be provided') - raw = raw.split(';base64,') - if len(raw) != 2: - raise ValueError('data must be a base64 encoded string or file and hash must be provided') - mime_type = raw[0].split(':')[1] - content = base64.b64decode(raw[1], validate=True) - kwargs.pop('data') - content_hash = sha256(content).hexdigest() - kwargs['file'] = ContentFile(content, content_hash) - kwargs['hash'] = content_hash - kwargs['mime_type'] = mime_type - elif 'file' in kwargs and 'hash' in kwargs and type(kwargs['file']) == ContentFile and 'mime_type' in kwargs: - pass - else: - raise ValueError('data must be a base64 encoded string or file and hash must be provided') + kwargs = self.__file_data_helper(**kwargs) if not self.filter(hash=kwargs['hash']).exists(): obj = 
super().create(**kwargs) obj.file.save(content=kwargs['file'], name=kwargs['hash']) diff --git a/core/inventory/api_v2.py b/core/inventory/api_v2.py index 326c049..04c1722 100644 --- a/core/inventory/api_v2.py +++ b/core/inventory/api_v2.py @@ -39,13 +39,61 @@ class ItemViewSet(viewsets.ModelViewSet): def filter_items(items, query): query_tokens = query.split(' ') + matches = [] for item in items: value = 0 + if "I#" + str(item.id) in query: + value += 12 + matches.append( + {'type': 'item_id', 'text': f'is exactly {item.id} and matched "I#{item.id}"'}) + elif "#" + str(item.id) in query: + value += 11 + matches.append( + {'type': 'item_id', 'text': f'is exactly {item.id} and matched "#{item.id}"'}) + elif str(item.id) in query: + value += 10 + matches.append({'type': 'item_id', 'text': f'is exactly {item.id}'}) + for issue in item.related_issues: + if "T#" + issue.short_uuid() in query: + value += 8 + matches.append({'type': 'ticket_uuid', + 'text': f'is exactly {issue.short_uuid()} and matched "T#{issue.short_uuid()}"'}) + elif "#" + issue.short_uuid() in query: + value += 5 + matches.append({'type': 'ticket_uuid', + 'text': f'is exactly {issue.short_uuid()} and matched "#{issue.short_uuid()}"'}) + elif issue.short_uuid() in query: + value += 3 + matches.append({'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()}'}) + if "T#" + str(issue.id) in query: + value += 8 + matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "T#{issue.id}"'}) + elif "#" + str(issue.id) in query: + value += 5 + matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "#{issue.id}"'}) + elif str(issue.id) in query: + value += 3 + matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id}'}) + for comment in issue.comments.all(): + for token in query_tokens: + if token in comment.comment: + value += 1 + matches.append({'type': 'ticket_comment', 'text': f'contains {token}'}) + for token in query_tokens: + if token in issue.name: + value += 1 + matches.append({'type': 'ticket_name', 'text': f'contains {token}'}) for token in query_tokens: if token in item.description: value += 1 + matches.append({'type': 'item_description', 'text': f'contains {token}'}) + for comment in item.comments.all(): + for token in query_tokens: + if token in comment.comment: + value += 1 + matches.append({'type': 'comment', 'text': f'contains {token}'}) if value > 0: - yield {'search_score': value, 'item': item} + yield {'search_score': value, 'item': item, 'search_matches': matches} @api_view(['GET']) diff --git a/core/inventory/migrations/0008_tag_item_tags.py b/core/inventory/migrations/0008_tag_item_tags.py deleted file mode 100644 index 4322407..0000000 --- a/core/inventory/migrations/0008_tag_item_tags.py +++ /dev/null @@ -1,25 +0,0 @@ -# Generated by Django 4.2.7 on 2025-01-29 20:56 - -from django.db import migrations, models - - -class Migration(migrations.Migration): - - dependencies = [ - ('inventory', '0007_rename_container_item_container_old_itemplacement_and_more'), - ] - - operations = [ - migrations.CreateModel( - name='Tag', - fields=[ - ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), - ('slug', models.TextField()), - ], - ), - migrations.AddField( - model_name='item', - name='tags', - field=models.ManyToManyField(to='inventory.tag'), - ), - ] diff --git a/core/inventory/models.py b/core/inventory/models.py index 23f9c27..c782bcb 100644 --- a/core/inventory/models.py +++ b/core/inventory/models.py 
@@ -1,6 +1,9 @@ from itertools import groupby from django.db import models +from django.db.models.signals import pre_save +from django.dispatch import receiver +from django.utils import timezone from django_softdelete.models import SoftDeleteModel, SoftDeleteManager @@ -28,7 +31,6 @@ class Item(SoftDeleteModel): returned_at = models.DateTimeField(blank=True, null=True) created_at = models.DateTimeField(null=True, auto_now_add=True) updated_at = models.DateTimeField(blank=True, null=True) - tags = models.ManyToManyField('Tag') @property def container(self): @@ -65,6 +67,11 @@ class Item(SoftDeleteModel): return '[' + str(self.id) + ']' + self.description +@receiver(pre_save, sender=Item) +def item_updated(sender, instance, **kwargs): + instance.updated_at = timezone.now() + + class Container(SoftDeleteModel): id = models.AutoField(primary_key=True) name = models.CharField(max_length=255) @@ -99,12 +106,6 @@ class Comment(models.Model): def __str__(self): return str(self.item) + ' comment #' + str(self.id) -class Tag(models.Model): - slug = models.TextField() - - def __str__(self): - return self.slug - class Event(models.Model): id = models.AutoField(primary_key=True) diff --git a/core/inventory/serializers.py b/core/inventory/serializers.py index 26a5be4..0661476 100644 --- a/core/inventory/serializers.py +++ b/core/inventory/serializers.py @@ -132,15 +132,33 @@ class ItemSerializer(BasicItemSerializer): 'cid': placement.container.id, 'box': placement.container.name }) + + if obj.created_at: + timeline.append({ + 'type': 'created', + 'timestamp': obj.created_at, + }) + if obj.returned_at: + timeline.append({ + 'type': 'returned', + 'timestamp': obj.returned_at, + }) + if obj.deleted_at: + timeline.append({ + 'type': 'deleted', + 'timestamp': obj.deleted_at, + }) return sorted(timeline, key=lambda x: x['timestamp']) class SearchResultSerializer(serializers.Serializer): search_score = serializers.IntegerField() + search_matches = serializers.ListField(child=serializers.DictField()) item = ItemSerializer() def to_representation(self, instance): - return {**ItemSerializer(instance['item']).data, 'search_score': instance['search_score']} + return {**ItemSerializer(instance['item']).data, 'search_score': instance['search_score'], + 'search_matches': instance['search_matches']} class Meta: model = Item diff --git a/core/inventory/tests/v2/test_items.py b/core/inventory/tests/v2/test_items.py index 0c85eb4..34c4739 100644 --- a/core/inventory/tests/v2/test_items.py +++ b/core/inventory/tests/v2/test_items.py @@ -63,28 +63,28 @@ class ItemTestCase(TestCase): self.assertEqual(response.json()[0]['file'], None) self.assertEqual(response.json()[0]['returned'], False) self.assertEqual(response.json()[0]['event'], self.event.slug) - self.assertEqual(len(response.json()[0]['timeline']), 4) - self.assertEqual(response.json()[0]['timeline'][0]['type'], 'placement') - self.assertEqual(response.json()[0]['timeline'][1]['type'], 'comment') - self.assertEqual(response.json()[0]['timeline'][2]['type'], 'issue_relation') - self.assertEqual(response.json()[0]['timeline'][3]['type'], 'placement') - self.assertEqual(response.json()[0]['timeline'][1]['id'], comment.id) - self.assertEqual(response.json()[0]['timeline'][2]['id'], match.id) - self.assertEqual(response.json()[0]['timeline'][3]['id'], placement.id) - self.assertEqual(response.json()[0]['timeline'][0]['box'], 'BOX1') - self.assertEqual(response.json()[0]['timeline'][0]['cid'], self.box1.id) - self.assertEqual(response.json()[0]['timeline'][1]['comment'], 
'test') - self.assertEqual(response.json()[0]['timeline'][1]['timestamp'], - comment.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ')) - self.assertEqual(response.json()[0]['timeline'][2]['status'], 'possible') - self.assertEqual(response.json()[0]['timeline'][2]['timestamp'], - match.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ')) - self.assertEqual(response.json()[0]['timeline'][2]['issue_thread']['name'], "test issue") - self.assertEqual(response.json()[0]['timeline'][2]['issue_thread']['event'], "EVENT") - self.assertEqual(response.json()[0]['timeline'][2]['issue_thread']['state'], "pending_new") - self.assertEqual(response.json()[0]['timeline'][3]['box'], 'BOX2') - self.assertEqual(response.json()[0]['timeline'][3]['cid'], self.box2.id) - self.assertEqual(response.json()[0]['timeline'][3]['timestamp'], + self.assertEqual(len(response.json()[0]['timeline']), 5) + self.assertEqual(response.json()[0]['timeline'][0]['type'], 'created') + self.assertEqual(response.json()[0]['timeline'][1]['type'], 'placement') + self.assertEqual(response.json()[0]['timeline'][2]['type'], 'comment') + self.assertEqual(response.json()[0]['timeline'][3]['type'], 'issue_relation') + self.assertEqual(response.json()[0]['timeline'][4]['type'], 'placement') + self.assertEqual(response.json()[0]['timeline'][2]['id'], comment.id) + self.assertEqual(response.json()[0]['timeline'][3]['id'], match.id) + self.assertEqual(response.json()[0]['timeline'][4]['id'], placement.id) + self.assertEqual(response.json()[0]['timeline'][1]['box'], 'BOX1') + self.assertEqual(response.json()[0]['timeline'][1]['cid'], self.box1.id) + self.assertEqual(response.json()[0]['timeline'][0]['timestamp'], item.created_at.strftime('%Y-%m-%dT%H:%M:%S.%fZ')) + self.assertEqual(response.json()[0]['timeline'][2]['comment'], 'test') + self.assertEqual(response.json()[0]['timeline'][2]['timestamp'], comment.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ')) + self.assertEqual(response.json()[0]['timeline'][3]['status'], 'possible') + self.assertEqual(response.json()[0]['timeline'][3]['timestamp'], match.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ')) + self.assertEqual(response.json()[0]['timeline'][3]['issue_thread']['name'], "test issue") + self.assertEqual(response.json()[0]['timeline'][3]['issue_thread']['event'], "EVENT") + self.assertEqual(response.json()[0]['timeline'][3]['issue_thread']['state'], "pending_new") + self.assertEqual(response.json()[0]['timeline'][4]['box'], 'BOX2') + self.assertEqual(response.json()[0]['timeline'][4]['cid'], self.box2.id) + self.assertEqual(response.json()[0]['timeline'][4]['timestamp'], placement.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ')) self.assertEqual(len(response.json()[0]['related_issues']), 1) self.assertEqual(response.json()[0]['related_issues'][0]['name'], "test issue") diff --git a/core/mail/protocol.py b/core/mail/protocol.py index 63cf6dd..23ae696 100644 --- a/core/mail/protocol.py +++ b/core/mail/protocol.py @@ -48,9 +48,21 @@ def unescape_and_decode_base64(s): return decoded -def unescape_simplified_quoted_printable(s): +def unescape_simplified_quoted_printable(s, encoding='utf-8'): import quopri - return quopri.decodestring(s).decode('utf-8') + return quopri.decodestring(s).decode(encoding) + + +def decode_inline_encodings(s): + s = unescape_and_decode_quoted_printable(s) + s = unescape_and_decode_base64(s) + return s + + +def ascii_strip(s): + if not s: + return None + return ''.join([c for c in str(s) if 128 > ord(c) > 31]) def collect_references(issue_thread): @@ -91,7 +103,7 @@ async def 
send_smtp(message): await aiosmtplib.send(message, hostname="127.0.0.1", port=25, use_tls=False, start_tls=False) -def find_active_issue_thread(in_reply_to, address, subject, event): +def find_active_issue_thread(in_reply_to, address, subject, event, spam=False): from re import match uuid_match = match(r'^ticket\+([a-f0-9-]{36})@', address) if uuid_match: @@ -102,7 +114,8 @@ def find_active_issue_thread(in_reply_to, address, subject, event): if reply_to.exists(): return reply_to.first().issue_thread, False else: - issue = IssueThread.objects.create(name=subject, event=event) + issue = IssueThread.objects.create(name=subject, event=event, + initial_state='pending_suspected_spam' if spam else 'pending_new') return issue, True @@ -116,6 +129,22 @@ def find_target_event(address): return None +def decode_email_segment(segment, charset, transfer_encoding): + decode_as = 'utf-8' + if charset == 'windows-1251': + decode_as = 'cp1251' + elif charset == 'iso-8859-1': + decode_as = 'latin1' + if transfer_encoding == 'quoted-printable': + segment = unescape_simplified_quoted_printable(segment, decode_as) + elif transfer_encoding == 'base64': + import base64 + segment = base64.b64decode(segment).decode('utf-8') + else: + segment = decode_inline_encodings(segment.decode('utf-8')) + return segment + + def parse_email_body(raw, log=None): import email from hashlib import sha256 @@ -127,21 +156,24 @@ def parse_email_body(raw, log=None): if parsed.is_multipart(): for part in parsed.walk(): ctype = part.get_content_type() + charset = part.get_content_charset() cdispo = str(part.get('Content-Disposition')) + if ctype == 'multipart/mixed': + log.debug("Ignoring Multipart %s %s", ctype, cdispo) # skip any text/plain (txt) attachments - if ctype == 'text/plain' and 'attachment' not in cdispo: + elif ctype == 'text/plain' and 'attachment' not in cdispo: segment = part.get_payload() if not segment: continue - segment = unescape_and_decode_quoted_printable(segment) - segment = unescape_and_decode_base64(segment) - if part.get('Content-Transfer-Encoding') == 'quoted-printable': - segment = unescape_simplified_quoted_printable(segment) + segment = decode_email_segment(segment.encode('utf-8'), charset, part.get('Content-Transfer-Encoding')) log.debug(segment) body = body + segment elif 'attachment' in cdispo or 'inline' in cdispo: - file = ContentFile(part.get_payload(decode=True)) + content = part.get_payload(decode=True) + if content is None: + continue + file = ContentFile(content) chash = sha256(file.read()).hexdigest() name = part.get_filename() if name is None: @@ -167,10 +199,8 @@ def parse_email_body(raw, log=None): else: log.warning("Unknown content type %s", parsed.get_content_type()) body = "Unknown content type" - body = unescape_and_decode_quoted_printable(body) - body = unescape_and_decode_base64(body) - if parsed.get('Content-Transfer-Encoding') == 'quoted-printable': - body = unescape_simplified_quoted_printable(body) + body = decode_email_segment(body.encode('utf-8'), parsed.get_content_charset(), + parsed.get('Content-Transfer-Encoding')) log.debug(body) return parsed, body, attachments @@ -182,8 +212,10 @@ def receive_email(envelope, log=None): header_from = parsed.get('From') header_to = parsed.get('To') - header_in_reply_to = parsed.get('In-Reply-To') - header_message_id = parsed.get('Message-ID') + header_in_reply_to = ascii_strip(parsed.get('In-Reply-To')) + header_message_id = ascii_strip(parsed.get('Message-ID')) + maybe_spam = parsed.get('X-Spam') + suspected_spam = (maybe_spam and 
maybe_spam.lower() == 'yes') if match(r'^([a-zA-Z ]*<)?MAILER-DAEMON@', header_from) and envelope.mail_from.strip("<>") == "": log.warning("Ignoring mailer daemon") @@ -191,25 +223,28 @@ def receive_email(envelope, log=None): if Email.objects.filter(reference=header_message_id).exists(): # break before issue thread is created log.warning("Email already exists") - raise Exception("Email already exists") + raise SpecialMailException("Email already exists") recipient = envelope.rcpt_tos[0].lower() if envelope.rcpt_tos else header_to.lower() sender = envelope.mail_from if envelope.mail_from else header_from - subject = parsed.get('Subject') + subject = ascii_strip(parsed.get('Subject')) if not subject: subject = "No subject" - subject = unescape_and_decode_quoted_printable(subject) - subject = unescape_and_decode_base64(subject) + subject = decode_inline_encodings(subject) + recipient = decode_inline_encodings(recipient) + sender = decode_inline_encodings(sender) target_event = find_target_event(recipient) - active_issue_thread, new = find_active_issue_thread(header_in_reply_to, recipient, subject, target_event) + active_issue_thread, new = find_active_issue_thread( + header_in_reply_to, recipient, subject, target_event, suspected_spam) from hashlib import sha256 random_filename = 'mail-' + sha256(envelope.content).hexdigest() email = Email.objects.create( sender=sender, recipient=recipient, body=body, subject=subject, reference=header_message_id, - in_reply_to=header_in_reply_to, raw_file=ContentFile(envelope.content, name=random_filename), event=target_event, + in_reply_to=header_in_reply_to, raw_file=ContentFile(envelope.content, name=random_filename), + event=target_event, issue_thread=active_issue_thread) for attachment in attachments: email.attachments.add(attachment) @@ -219,7 +254,7 @@ def receive_email(envelope, log=None): if new: # auto reply if new issue references = collect_references(active_issue_thread) - if not sender.startswith('noreply'): + if not sender.startswith('noreply') and not sender.startswith('no-reply') and not suspected_spam: subject = f"Re: {subject} [#{active_issue_thread.short_uuid()}]" body = '''Your request (#{}) has been received and will be reviewed by our lost&found angels. 
diff --git a/core/mail/tests/v2/test_mails.py b/core/mail/tests/v2/test_mails.py index 3b358ca..95d35cb 100644 --- a/core/mail/tests/v2/test_mails.py +++ b/core/mail/tests/v2/test_mails.py @@ -142,7 +142,7 @@ class LMTPHandlerTestCase(TestCase): # TODO replace with less hacky test aiosmtplib.send.assert_called_once() self.assertEqual('test ä', Email.objects.all()[0].subject) self.assertEqual('Text mit Quoted-Printable-Kodierung: äöüß', Email.objects.all()[0].body) - self.assertTrue( Email.objects.all()[0].raw_file.path) + self.assertTrue(Email.objects.all()[0].raw_file.path) def test_handle_quoted_printable_2(self): from aiosmtpd.smtp import Envelope @@ -163,9 +163,9 @@ class LMTPHandlerTestCase(TestCase): # TODO replace with less hacky test aiosmtplib.send.assert_called_once() self.assertEqual('suche_Mütze', Email.objects.all()[0].subject) self.assertEqual('Text mit Quoted-Printable-Kodierung: äöüß', Email.objects.all()[0].body) - self.assertTrue( Email.objects.all()[0].raw_file.path) + self.assertTrue(Email.objects.all()[0].raw_file.path) - def test_handle_base64(self): + def test_handle_base64_inline(self): from aiosmtpd.smtp import Envelope from asgiref.sync import async_to_sync import aiosmtplib @@ -184,7 +184,36 @@ class LMTPHandlerTestCase(TestCase): # TODO replace with less hacky test aiosmtplib.send.assert_called_once() self.assertEqual('test', Email.objects.all()[0].subject) self.assertEqual('Text mit Base64-Kodierung: äöüß', Email.objects.all()[0].body) - self.assertTrue( Email.objects.all()[0].raw_file.path) + self.assertTrue(Email.objects.all()[0].raw_file.path) + + def test_handle_base64_transfer_encoding(self): + from aiosmtpd.smtp import Envelope + from asgiref.sync import async_to_sync + import aiosmtplib + aiosmtplib.send = make_mocked_coro() + handler = LMTPHandler() + server = mock.Mock() + session = mock.Mock() + envelope = Envelope() + envelope.mail_from = 'test1@test' + envelope.rcpt_tos = ['test2@test'] + envelope.content = b'''Subject: test +From: test3@test +To: test4@test +Message-ID: <1@test> +Content-Type: text/plain; charset=utf-8 +Content-Transfer-Encoding: base64 + +VGVzdCBtaXQgQmFzZTY0LUtvZGllcnVuZzogw6TDtsO8w58=''' + + result = async_to_sync(handler.handle_DATA)(server, session, envelope) + self.assertEqual(result, '250 Message accepted for delivery') + self.assertEqual(len(Email.objects.all()), 2) + self.assertEqual(len(IssueThread.objects.all()), 1) + aiosmtplib.send.assert_called_once() + self.assertEqual('test', Email.objects.all()[0].subject) + self.assertEqual('Test mit Base64-Kodierung: äöüß', Email.objects.all()[0].body) + self.assertTrue(Email.objects.all()[0].raw_file.path) def test_handle_client_reply(self): issue_thread = IssueThread.objects.create( @@ -232,7 +261,7 @@ class LMTPHandlerTestCase(TestCase): # TODO replace with less hacky test self.assertEqual(IssueThread.objects.all()[0].name, 'test') self.assertEqual(IssueThread.objects.all()[0].state, 'pending_new') self.assertEqual(IssueThread.objects.all()[0].assigned_to, None) - self.assertTrue( Email.objects.all()[2].raw_file.path) + self.assertTrue(Email.objects.all()[2].raw_file.path) def test_handle_client_reply_2(self): issue_thread = IssueThread.objects.create( @@ -285,7 +314,7 @@ class LMTPHandlerTestCase(TestCase): # TODO replace with less hacky test self.assertEqual(IssueThread.objects.all()[0].name, 'test') self.assertEqual(IssueThread.objects.all()[0].state, 'pending_open') self.assertEqual(IssueThread.objects.all()[0].assigned_to, None) - self.assertTrue( 
Email.objects.all()[2].raw_file.path) + self.assertTrue(Email.objects.all()[2].raw_file.path) def test_mail_reply(self): issue_thread = IssueThread.objects.create( @@ -783,6 +812,44 @@ dGVzdGltYWdl self.assertEqual(None, IssueThread.objects.all()[0].assigned_to) aiosmtplib.send.assert_called_once() + def test_mail_spam_header(self): + from aiosmtpd.smtp import Envelope + from asgiref.sync import async_to_sync + import aiosmtplib + aiosmtplib.send = make_mocked_coro() + handler = LMTPHandler() + server = mock.Mock() + session = mock.Mock() + envelope = Envelope() + envelope.mail_from = 'test1@test' + envelope.rcpt_tos = ['test2@test'] + envelope.content = b'''Subject: test +From: test1@test +To: test2@test +Message-ID: <1@test> +X-Spam: Yes + +test''' + result = async_to_sync(handler.handle_DATA)(server, session, envelope) + + self.assertEqual(result, '250 Message accepted for delivery') + self.assertEqual(len(Email.objects.all()), 1) # do not send auto reply if spam is suspected + self.assertEqual(len(IssueThread.objects.all()), 1) + aiosmtplib.send.assert_not_called() + self.assertEqual('test', Email.objects.all()[0].subject) + self.assertEqual('test1@test', Email.objects.all()[0].sender) + self.assertEqual('test2@test', Email.objects.all()[0].recipient) + self.assertEqual('test', Email.objects.all()[0].body) + self.assertEqual(IssueThread.objects.all()[0], Email.objects.all()[0].issue_thread) + self.assertEqual('<1@test>', Email.objects.all()[0].reference) + self.assertEqual(None, Email.objects.all()[0].in_reply_to) + self.assertEqual('test', IssueThread.objects.all()[0].name) + self.assertEqual('pending_suspected_spam', IssueThread.objects.all()[0].state) + self.assertEqual(None, IssueThread.objects.all()[0].assigned_to) + states = StateChange.objects.filter(issue_thread=IssueThread.objects.all()[0]) + self.assertEqual(1, len(states)) + self.assertEqual('pending_suspected_spam', states[0].state) + def test_mail_4byte_unicode_emoji(self): from aiosmtpd.smtp import Envelope from asgiref.sync import async_to_sync @@ -887,6 +954,59 @@ hello \xe4\xf6\xfc''' self.assertEqual(1, len(states)) self.assertEqual('pending_new', states[0].state) + def test_mail_windows_1252(self): + from aiosmtpd.smtp import Envelope + from asgiref.sync import async_to_sync + import aiosmtplib + + aiosmtplib.send = make_mocked_coro() + + handler = LMTPHandler() + server = mock.Mock() + session = mock.Mock() + envelope = Envelope() + + envelope.mail_from = 'test1@test' + envelope.rcpt_tos = ['test2@test'] + + envelope.content = b'''Subject: test +From: test1@test +To: test2@test +Message-ID: <1@test> +Content-Type: text/html; charset=windows-1252 +Content-Transfer-Encoding: quoted-printable + +=0D=0Ahello=''' + + result = async_to_sync(handler.handle_DATA)(server, session, envelope) + self.assertEqual('250 Message accepted for delivery', result) + self.assertEqual(2, len(Email.objects.all())) + self.assertEqual(1, len(IssueThread.objects.all())) + aiosmtplib.send.assert_called_once() + self.assertEqual('test', Email.objects.all()[0].subject) + self.assertEqual('test1@test', Email.objects.all()[0].sender) + self.assertEqual('test2@test', Email.objects.all()[0].recipient) + self.assertEqual('\r\nhello', Email.objects.all()[0].body) + self.assertEqual(IssueThread.objects.all()[0], Email.objects.all()[0].issue_thread) + self.assertEqual('<1@test>', Email.objects.all()[0].reference) + self.assertEqual(None, Email.objects.all()[0].in_reply_to) + self.assertEqual(expected_auto_reply_subject.format('test', 
IssueThread.objects.all()[0].short_uuid()), + Email.objects.all()[1].subject) + self.assertEqual('test2@test', Email.objects.all()[1].sender) + self.assertEqual('test1@test', Email.objects.all()[1].recipient) + self.assertEqual(expected_auto_reply.format(IssueThread.objects.all()[0].short_uuid()), + Email.objects.all()[1].body) + self.assertEqual(IssueThread.objects.all()[0], Email.objects.all()[1].issue_thread) + self.assertTrue(Email.objects.all()[1].reference.startswith("<")) + self.assertTrue(Email.objects.all()[1].reference.endswith("@localhost>")) + self.assertEqual("<1@test>", Email.objects.all()[1].in_reply_to) + self.assertEqual('test', IssueThread.objects.all()[0].name) + self.assertEqual('pending_new', IssueThread.objects.all()[0].state) + self.assertEqual(None, IssueThread.objects.all()[0].assigned_to) + states = StateChange.objects.filter(issue_thread=IssueThread.objects.all()[0]) + self.assertEqual(1, len(states)) + self.assertEqual('pending_new', states[0].state) + def test_mail_quoted_printable_transfer_encoding(self): from aiosmtpd.smtp import Envelope from asgiref.sync import async_to_sync @@ -939,3 +1059,146 @@ hello =C3=A4=C3=B6=C3=BC''' states = StateChange.objects.filter(issue_thread=IssueThread.objects.all()[0]) self.assertEqual(1, len(states)) self.assertEqual('pending_new', states[0].state) + + def test_text_with_attachment2(self): + from aiosmtpd.smtp import Envelope + from asgiref.sync import async_to_sync + import aiosmtplib + aiosmtplib.send = make_mocked_coro() + handler = LMTPHandler() + server = mock.Mock() + session = mock.Mock() + envelope = Envelope() + envelope.mail_from = 'test1@test' + envelope.rcpt_tos = ['test2@test'] + envelope.content = b'''Subject: test +From: test1@test +To: test2@test +Message-ID: <1@test> +Content-Type: multipart/mixed; boundary="abc" +Content-Disposition: inline +Content-Transfer-Encoding: 8bit + +--abc +Content-Type: text/plain; charset=utf-8 +Content-Disposition: inline +Content-Transfer-Encoding: 8bit + +test1 + +--abc +Content-Type: image/jpeg; name="test.jpg" +Content-Disposition: attachment; filename="test.jpg" +Content-Transfer-Encoding: base64 +Content-ID: <1> +X-Attachment-Id: 1 + +dGVzdGltYWdl + +--abc--''' + + result = async_to_sync(handler.handle_DATA)(server, session, envelope) + self.assertEqual(result, '250 Message accepted for delivery') + self.assertEqual(len(Email.objects.all()), 2) + self.assertEqual(len(IssueThread.objects.all()), 1) + aiosmtplib.send.assert_called_once() + self.assertEqual('test', Email.objects.all()[0].subject) + self.assertEqual('test1@test', Email.objects.all()[0].sender) + self.assertEqual('test2@test', Email.objects.all()[0].recipient) + self.assertEqual('test1\n', Email.objects.all()[0].body) + self.assertEqual(IssueThread.objects.all()[0], Email.objects.all()[0].issue_thread) + self.assertEqual('<1@test>', Email.objects.all()[0].reference) + self.assertEqual(None, Email.objects.all()[0].in_reply_to) + self.assertEqual(expected_auto_reply_subject.format('test', IssueThread.objects.all()[0].short_uuid()), + Email.objects.all()[1].subject) + self.assertEqual('test2@test', Email.objects.all()[1].sender) + self.assertEqual('test1@test', Email.objects.all()[1].recipient) + self.assertEqual(expected_auto_reply.format(IssueThread.objects.all()[0].short_uuid()), + Email.objects.all()[1].body) + self.assertEqual(IssueThread.objects.all()[0], Email.objects.all()[1].issue_thread) + self.assertTrue(Email.objects.all()[1].reference.startswith("<")) + 
self.assertTrue(Email.objects.all()[1].reference.endswith("@localhost>")) + self.assertEqual("<1@test>", Email.objects.all()[1].in_reply_to) + self.assertEqual('test', IssueThread.objects.all()[0].name) + self.assertEqual('pending_new', IssueThread.objects.all()[0].state) + self.assertEqual(None, IssueThread.objects.all()[0].assigned_to) + states = StateChange.objects.filter(issue_thread=IssueThread.objects.all()[0]) + self.assertEqual(1, len(states)) + self.assertEqual('pending_new', states[0].state) + self.assertEqual(1, len(EmailAttachment.objects.all())) + self.assertEqual(1, EmailAttachment.objects.all()[0].id) + self.assertEqual('image/jpeg', EmailAttachment.objects.all()[0].mime_type) + self.assertEqual('test.jpg', EmailAttachment.objects.all()[0].name) + file_content = EmailAttachment.objects.all()[0].file.read() + self.assertEqual(b'testimage', file_content) + + + def test_text_non_utf8_in_multipart(self): + from aiosmtpd.smtp import Envelope + from asgiref.sync import async_to_sync + import aiosmtplib + + aiosmtplib.send = make_mocked_coro() + + handler = LMTPHandler() + server = mock.Mock() + session = mock.Mock() + envelope = Envelope() + + envelope.mail_from = 'test1@test' + envelope.rcpt_tos = ['test2@test'] + + envelope.content = b'''Subject: test +From: test1@test +To: test2@test +Message-ID: <1@test> +Content-Type: multipart/alternative; boundary="abc" + +--abc +Content-Type: text/plain; charset=utf-8 +Content-Transfer-Encoding: 8bit + +test1 + +--abc +Content-Type: text/plain; charset=iso-8859-1 +Content-Transfer-Encoding: quoted-printable + +hello =E4 + +--abc +Content-Type: text/plain; charset=windows-1252 +Content-Transfer-Encoding: quoted-printable + +=0D=0Ahello + +--abc--''' + + result = async_to_sync(handler.handle_DATA)(server, session, envelope) + self.assertEqual(result, '250 Message accepted for delivery') + self.assertEqual(len(Email.objects.all()), 2) + self.assertEqual(len(IssueThread.objects.all()), 1) + aiosmtplib.send.assert_called_once() + self.assertEqual('test', Email.objects.all()[0].subject) + self.assertEqual('test1@test', Email.objects.all()[0].sender) + self.assertEqual('test2@test', Email.objects.all()[0].recipient) + self.assertEqual('test1\nhello ä\n\r\nhello\n', Email.objects.all()[0].body) + self.assertEqual(IssueThread.objects.all()[0], Email.objects.all()[0].issue_thread) + self.assertEqual('<1@test>', Email.objects.all()[0].reference) + self.assertEqual(None, Email.objects.all()[0].in_reply_to) + self.assertEqual(expected_auto_reply_subject.format('test', IssueThread.objects.all()[0].short_uuid()), + Email.objects.all()[1].subject) + self.assertEqual('test2@test', Email.objects.all()[1].sender) + self.assertEqual('test1@test', Email.objects.all()[1].recipient) + self.assertEqual(expected_auto_reply.format(IssueThread.objects.all()[0].short_uuid()), + Email.objects.all()[1].body) + self.assertEqual(IssueThread.objects.all()[0], Email.objects.all()[1].issue_thread) + self.assertTrue(Email.objects.all()[1].reference.startswith("<")) + self.assertTrue(Email.objects.all()[1].reference.endswith("@localhost>")) + self.assertEqual("<1@test>", Email.objects.all()[1].in_reply_to) + self.assertEqual('test', IssueThread.objects.all()[0].name) + self.assertEqual('pending_new', IssueThread.objects.all()[0].state) + self.assertEqual(None, IssueThread.objects.all()[0].assigned_to) + states = StateChange.objects.filter(issue_thread=IssueThread.objects.all()[0]) + self.assertEqual(1, len(states)) + self.assertEqual('pending_new', states[0].state) diff --git 
a/core/requirements.dev.txt b/core/requirements.dev.txt index 61a5b51..ed037b6 100644 --- a/core/requirements.dev.txt +++ b/core/requirements.dev.txt @@ -13,7 +13,7 @@ Automat==22.10.0 beautifulsoup4==4.12.2 bs4==0.0.1 certifi==2023.11.17 -cffi==1.17.1 +#cffi==1.16.0 channels==4.0.0 channels-redis==4.1.0 charset-normalizer==3.3.2 @@ -40,12 +40,12 @@ inflection==0.5.1 itypes==1.2.0 Jinja2==3.1.2 MarkupSafe==2.1.3 -msgpack==1.1.0 -msgpack-python==0.5.6 +#msgpack==1.0.7 +#msgpack-python==0.5.6 multidict==6.0.5 openapi-codec==1.3.2 packaging==23.2 -Pillow==10.4.0 +Pillow==11.1.0 pyasn1==0.5.1 pyasn1-modules==0.3.0 pycares==4.4.0 @@ -69,7 +69,6 @@ typing_extensions==4.8.0 uritemplate==4.1.1 urllib3==2.1.0 uvicorn==0.24.0.post1 -watchfiles==0.21.0 websockets==12.0 yarl==1.9.4 zope.interface==6.1 diff --git a/core/tickets/api_v2.py b/core/tickets/api_v2.py index f439275..6e34465 100644 --- a/core/tickets/api_v2.py +++ b/core/tickets/api_v2.py @@ -131,14 +131,71 @@ def add_comment(request, pk): def filter_issues(issues, query): - query_tokens = query.split(' ') + query_tokens = query.lower().split(' ') + matches = [] for issue in issues: value = 0 + if "T#" + issue.short_uuid() in query: + value += 12 + matches.append( + {'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()} and matched "T#{issue.short_uuid()}"'}) + elif "#" + issue.short_uuid() in query: + value += 11 + matches.append( + {'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()} and matched "#{issue.short_uuid()}"'}) + elif issue.short_uuid() in query: + value += 10 + matches.append({'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()}'}) + if "T#" + str(issue.id) in query: + value += 10 + matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "T#{issue.id}"'}) + elif "#" + str(issue.id) in query: + value += 7 + matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "#{issue.id}"'}) + elif str(issue.id) in query: + value += 4 + matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id}'}) + for item in issue.related_items: + if "I#" + str(item.id) in query: + value += 8 + matches.append({'type': 'item_id', 'text': f'is exactly {item.id} and matched "I#{item.id}"'}) + elif "#" + str(item.id) in query: + value += 5 + matches.append({'type': 'item_id', 'text': f'is exactly {item.id} and matched "#{item.id}"'}) + elif str(item.id) in query: + value += 3 + matches.append({'type': 'item_id', 'text': f'is exactly {item.id}'}) + for token in query_tokens: + if token in item.description.lower(): + value += 1 + matches.append({'type': 'item_description', 'text': f'contains {token}'}) + for comment in item.comments.all(): + for token in query_tokens: + if token in comment.comment.lower(): + value += 1 + matches.append({'type': 'item_comment', 'text': f'contains {token}'}) for token in query_tokens: - if token in issue.description: + if token in issue.name.lower(): value += 1 + matches.append({'type': 'ticket_name', 'text': f'contains {token}'}) + for comment in issue.comments.all(): + for token in query_tokens: + if token in comment.comment.lower(): + value += 1 + matches.append({'type': 'ticket_comment', 'text': f'contains {token}'}) + for email in issue.emails.all(): + for token in query_tokens: + if token in email.subject.lower(): + value += 1 + matches.append({'type': 'email_subject', 'text': f'contains {token}'}) + if token in email.body.lower(): + value += 1 + matches.append({'type': 'email_body', 'text': f'contains {token}'}) + if token in 
email.sender.lower(): + value += 1 + matches.append({'type': 'email_sender', 'text': f'contains {token}'}) if value > 0: - yield {'search_score': value, 'issue': issue} + yield {'search_score': value, 'issue': issue, 'search_matches': matches} @api_view(['GET']) @@ -148,7 +205,10 @@ def search_issues(request, event_slug, query): event = Event.objects.get(slug=event_slug) if not request.user.has_event_perm(event, 'view_issuethread'): return Response(status=403) - items = filter_issues(IssueThread.objects.filter(event=event), b64decode(query).decode('utf-8')) + serializer = IssueSerializer() + queryset = IssueThread.objects.filter(event=event) + items = filter_issues(queryset.prefetch_related(*serializer.Meta.prefetch_related_fields), + b64decode(query).decode('utf-8')) return Response(SearchResultSerializer(items, many=True).data) except Event.DoesNotExist: return Response(status=404) diff --git a/core/tickets/migrations/0013_alter_statechange_state.py b/core/tickets/migrations/0013_alter_statechange_state.py new file mode 100644 index 0000000..6a99ce5 --- /dev/null +++ b/core/tickets/migrations/0013_alter_statechange_state.py @@ -0,0 +1,18 @@ +# Generated by Django 4.2.7 on 2025-03-15 21:31 + +from django.db import migrations, models + + +class Migration(migrations.Migration): + + dependencies = [ + ('tickets', '0012_remove_issuethread_related_items_and_more'), + ] + + operations = [ + migrations.AlterField( + model_name='statechange', + name='state', + field=models.CharField(choices=[('pending_new', 'New'), ('pending_open', 'Open'), ('pending_shipping', 'Needs to be shipped'), ('pending_physical_confirmation', 'Needs to be confirmed physically'), ('pending_return', 'Needs to be returned'), ('pending_postponed', 'Postponed'), ('pending_suspected_spam', 'Suspected Spam'), ('waiting_details', 'Waiting for details'), ('waiting_pre_shipping', 'Waiting for Address/Shipping Info'), ('closed_returned', 'Closed: Returned'), ('closed_shipped', 'Closed: Shipped'), ('closed_not_found', 'Closed: Not found'), ('closed_not_our_problem', 'Closed: Not our problem'), ('closed_duplicate', 'Closed: Duplicate'), ('closed_timeout', 'Closed: Timeout'), ('closed_spam', 'Closed: Spam'), ('closed_nothing_missing', 'Closed: Nothing missing'), ('closed_wtf', 'Closed: WTF'), ('found_open', 'Item Found and stored externally'), ('found_closed', 'Item Found and stored externally and closed')], default='pending_new', max_length=255), + ), + ] diff --git a/core/tickets/models.py b/core/tickets/models.py index f7ddb7b..794c8e4 100644 --- a/core/tickets/models.py +++ b/core/tickets/models.py @@ -16,6 +16,7 @@ STATE_CHOICES = ( ('pending_physical_confirmation', 'Needs to be confirmed physically'), ('pending_return', 'Needs to be returned'), ('pending_postponed', 'Postponed'), + ('pending_suspected_spam', 'Suspected Spam'), ('waiting_details', 'Waiting for details'), ('waiting_pre_shipping', 'Waiting for Address/Shipping Info'), ('closed_returned', 'Closed: Returned'), @@ -46,6 +47,11 @@ class IssueThread(SoftDeleteModel): event = models.ForeignKey(Event, null=True, on_delete=models.SET_NULL, related_name='issue_threads') manually_created = models.BooleanField(default=False) + def __init__(self, *args, **kwargs): + if 'initial_state' in kwargs: + self._initial_state = kwargs.pop('initial_state') + super().__init__(*args, **kwargs) + def short_uuid(self): return self.uuid[:8] @@ -110,8 +116,9 @@ def set_uuid(sender, instance, **kwargs): @receiver(post_save, sender=IssueThread) def create_issue_thread(sender, instance, created, 
**kwargs): - if created: - StateChange.objects.create(issue_thread=instance, state='pending_new') + if created and instance.state_changes.count() == 0: + initial_state = getattr(instance, '_initial_state', None) + StateChange.objects.create(issue_thread=instance, state=initial_state if initial_state else 'pending_new') class Comment(models.Model): diff --git a/core/tickets/serializers.py b/core/tickets/serializers.py index 50cdb72..ff695b1 100644 --- a/core/tickets/serializers.py +++ b/core/tickets/serializers.py @@ -139,10 +139,12 @@ class IssueSerializer(BasicIssueSerializer): class SearchResultSerializer(serializers.Serializer): search_score = serializers.IntegerField() - item = IssueSerializer() + search_matches = serializers.ListField(child=serializers.DictField()) + issue = IssueSerializer() def to_representation(self, instance): - return {**IssueSerializer(instance['item']).data, 'search_score': instance['search_score']} + return {**IssueSerializer(instance['issue']).data, 'search_score': instance['search_score'], + 'search_matches': instance['search_matches']} class Meta: model = IssueThread diff --git a/core/tickets/shared_serializers.py b/core/tickets/shared_serializers.py index ac16d81..3d46013 100644 --- a/core/tickets/shared_serializers.py +++ b/core/tickets/shared_serializers.py @@ -9,6 +9,7 @@ class RelationSerializer(serializers.ModelSerializer): class Meta: model = ItemRelation fields = ('id', 'status', 'timestamp', 'item', 'issue_thread') + read_only_fields = ('id', 'timestamp') class BasicIssueSerializer(serializers.ModelSerializer): diff --git a/core/tickets/tests/v2/test_tickets.py b/core/tickets/tests/v2/test_tickets.py index 9720625..d7bb346 100644 --- a/core/tickets/tests/v2/test_tickets.py +++ b/core/tickets/tests/v2/test_tickets.py @@ -4,6 +4,7 @@ from django.test import TestCase, Client from authentication.models import ExtendedUser from inventory.models import Event, Container, Item +from inventory.models import Comment as ItemComment from mail.models import Email, EmailAttachment from tickets.models import IssueThread, StateChange, Comment, ItemRelation, Assignment from django.contrib.auth.models import Permission @@ -383,15 +384,108 @@ class IssueSearchTest(TestCase): def setUp(self): super().setUp() - self.event = Event.objects.create(slug='EVENT', name='Event') self.user = ExtendedUser.objects.create_user('testuser', 'test', 'test') self.user.user_permissions.add(*Permission.objects.all()) self.user.save() + self.event = Event.objects.create(slug='EVENT', name='Event') + self.box = Container.objects.create(name='box1') + self.item = Item.objects.create(container=self.box, description="foo", event=self.event) self.token = AuthToken.objects.create(user=self.user) self.client = Client(headers={'Authorization': 'Token ' + self.token[1]}) - def test_search(self): + def test_search_empty_result(self): search_query = b64encode(b'abc').decode('utf-8') response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/') self.assertEqual(200, response.status_code) self.assertEqual([], response.json()) + + def test_search(self): + now = datetime.now() + issue = IssueThread.objects.create( + name="test issue Abc", + event=self.event, + ) + mail1 = Email.objects.create( + subject='test', + body='test aBc', + sender='bar@test', + recipient='2@test', + issue_thread=issue, + timestamp=now, + ) + mail2 = Email.objects.create( + subject='Re: test', + body='test', + sender='2@test', + recipient='1@test', + issue_thread=issue, + in_reply_to=mail1.reference, + 
timestamp=now + timedelta(seconds=2), + ) + assignment = Assignment.objects.create( + issue_thread=issue, + assigned_to=self.user, + timestamp=now + timedelta(seconds=3), + ) + comment = Comment.objects.create( + issue_thread=issue, + comment="test deF", + timestamp=now + timedelta(seconds=4), + ) + match = ItemRelation.objects.create( + issue_thread=issue, + item=self.item, + timestamp=now + timedelta(seconds=5), + ) + item_comment = ItemComment.objects.create( + item=self.item, + comment="baz", + timestamp=now + timedelta(seconds=6), + ) + search_query = b64encode(b'abC').decode('utf-8') + response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/') + self.assertEqual(200, response.status_code) + self.assertEqual(1, len(response.json())) + self.assertEqual(issue.id, response.json()[0]['id']) + score2 = response.json()[0]['search_score'] + + search_query = b64encode(b'dEf').decode('utf-8') + response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/') + self.assertEqual(200, response.status_code) + self.assertEqual(1, len(response.json())) + self.assertEqual(issue.id, response.json()[0]['id']) + score1 = response.json()[0]['search_score'] + + search_query = b64encode(b'ghi').decode('utf-8') + response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/') + self.assertEqual(200, response.status_code) + self.assertEqual(0, len(response.json())) + + search_query = b64encode(b'Abc def').decode('utf-8') + response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/') + self.assertEqual(200, response.status_code) + self.assertEqual(1, len(response.json())) + self.assertEqual(issue.id, response.json()[0]['id']) + score3 = response.json()[0]['search_score'] + + self.assertGreater(score3, score2) + self.assertGreater(score2, score1) + self.assertGreater(score1, 0) + + search_query = b64encode(b'foo').decode('utf-8') + response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/') + self.assertEqual(200, response.status_code) + self.assertEqual(1, len(response.json())) + self.assertEqual(issue.id, response.json()[0]['id']) + + search_query = b64encode(b'bar').decode('utf-8') + response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/') + self.assertEqual(200, response.status_code) + self.assertEqual(1, len(response.json())) + self.assertEqual(issue.id, response.json()[0]['id']) + + search_query = b64encode(b'baz').decode('utf-8') + response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/') + self.assertEqual(200, response.status_code) + self.assertEqual(1, len(response.json())) + self.assertEqual(issue.id, response.json()[0]['id']) diff --git a/deploy/ansible/playbooks/deploy-c3lf-sys3.yml b/deploy/ansible/playbooks/deploy-c3lf-sys3.yml index 544b4e4..4005146 100644 --- a/deploy/ansible/playbooks/deploy-c3lf-sys3.yml +++ b/deploy/ansible/playbooks/deploy-c3lf-sys3.yml @@ -345,6 +345,13 @@ notify: - restart postfix + - name: configure rspamd dkim + template: + src: templates/rspamd-dkim.cf.j2 + dest: /etc/rspamd/local.d/dkim_signing.conf + notify: + - restart rspamd + - name: configure rspamd copy: content: | diff --git a/deploy/ansible/playbooks/templates/postfix.cf.j2 b/deploy/ansible/playbooks/templates/postfix.cf.j2 index f80d69b..f6e0b09 100644 --- a/deploy/ansible/playbooks/templates/postfix.cf.j2 +++ b/deploy/ansible/playbooks/templates/postfix.cf.j2 @@ -32,12 +32,11 @@ smtp_tls_session_cache_database = btree:${data_directory}/smtp_scache smtpd_relay_restrictions = 
    permit_mynetworks permit_sasl_authenticated defer_unauth_destination
-myhostname = polaris.c3lf.de
+myhostname = polaris.lab.or.it
alias_maps = hash:/etc/aliases
alias_database = hash:/etc/aliases
myorigin = /etc/mailname
mydestination = $myhostname, , localhost
-relayhost = firefly.lab.or.it
mynetworks = 127.0.0.0/8 [::ffff:127.0.0.0]/104 [::1]/128
mailbox_size_limit = 0
recipient_delimiter = +
diff --git a/deploy/ansible/playbooks/templates/rspamd-dkim.cf.j2 b/deploy/ansible/playbooks/templates/rspamd-dkim.cf.j2
new file mode 100644
index 0000000..9e21aa5
--- /dev/null
+++ b/deploy/ansible/playbooks/templates/rspamd-dkim.cf.j2
@@ -0,0 +1,79 @@
+# local.d/dkim_signing.conf
+
+enabled = true;
+
+# If false, messages with empty envelope from are not signed
+allow_envfrom_empty = true;
+
+# If true, envelope/header domain mismatch is ignored
+allow_hdrfrom_mismatch = false;
+
+# If true, multiple from headers are allowed (but only first is used)
+allow_hdrfrom_multiple = false;
+
+# If true, username does not need to contain matching domain
+allow_username_mismatch = false;
+
+# Default path to key, can include '$domain' and '$selector' variables
+path = "/var/lib/rspamd/dkim/$domain.$selector.key";
+
+# Default selector to use
+selector = "dkim";
+
+# If false, messages from authenticated users are not selected for signing
+sign_authenticated = true;
+
+# If false, messages from local networks are not selected for signing
+sign_local = true;
+
+# Map file of IP addresses/subnets to consider for signing
+# sign_networks = "/some/file"; # or url
+
+# Symbol to add when message is signed
+symbol = "DKIM_SIGNED";
+
+# Whether to fallback to global config
+try_fallback = true;
+
+# Domain to use for DKIM signing: can be "header" (MIME From), "envelope" (SMTP From), "recipient" (SMTP To), "auth" (SMTP username) or directly specified domain name
+use_domain = "header";
+
+# Domain to use for DKIM signing when sender is in sign_networks ("header"/"envelope"/"auth")
+#use_domain_sign_networks = "header";
+
+# Domain to use for DKIM signing when sender is a local IP ("header"/"envelope"/"auth")
+#use_domain_sign_local = "header";
+
+# Whether to normalise domains to eSLD
+use_esld = true;
+
+# Whether to get keys from Redis
+use_redis = false;
+
+# Hash for DKIM keys in Redis
+key_prefix = "DKIM_KEYS";
+
+# map of domains -> names of selectors (since rspamd 1.5.3)
+#selector_map = "/etc/rspamd/dkim_selectors.map";
+
+# map of domains -> paths to keys (since rspamd 1.5.3)
+#path_map = "/etc/rspamd/dkim_paths.map";
+
+# If `true` get pubkey from DNS record and check if it matches private key
+check_pubkey = false;
+# Set to `false` if you want to skip signing if public and private keys mismatch
+allow_pubkey_mismatch = true;
+
+# Domain specific settings
+domain {
+  # Domain name is used as key
+  c3lf.de {
+
+    # Private key path
+    path = "/var/lib/rspamd/dkim/{{ mail_domain }}.key";
+
+    # Selector
+    selector = "{{ mail_domain }}";
+  }
+}
+
diff --git a/deploy/dev/Dockerfile.backend b/deploy/dev/Dockerfile.backend
index 19c2efd..57ab856 100644
--- a/deploy/dev/Dockerfile.backend
+++ b/deploy/dev/Dockerfile.backend
@@ -1,13 +1,8 @@
-FROM python:3.11-bookworm
+FROM python:3.11-slim-bookworm
LABEL authors="lagertonne"
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY requirements.dev.txt /code/
-COPY requirements.prod.txt /code/
-RUN apt update && apt install -y mariadb-client
-RUN pip install -r requirements.dev.txt
-RUN pip install -r requirements.prod.txt
-RUN pip install mysqlclient
-COPY .. /code/
\ No newline at end of file
+RUN pip install -r requirements.dev.txt
\ No newline at end of file
diff --git a/deploy/dev/Dockerfile.frontend b/deploy/dev/Dockerfile.frontend
index 0a41d1a..a8fd652 100644
--- a/deploy/dev/Dockerfile.frontend
+++ b/deploy/dev/Dockerfile.frontend
@@ -1,4 +1,4 @@
-FROM docker.io/node:22
+FROM node:22-alpine
RUN mkdir /web
WORKDIR /web
diff --git a/deploy/dev/docker-compose.yml b/deploy/dev/docker-compose.yml
index 8580127..cf1bfdc 100644
--- a/deploy/dev/docker-compose.yml
+++ b/deploy/dev/docker-compose.yml
@@ -1,3 +1,4 @@
+name: c3lf-sys3-dev
services:
  core:
    build:
@@ -6,11 +7,12 @@
    command: bash -c 'python manage.py migrate && python testdata.py && python manage.py runserver 0.0.0.0:8000'
    environment:
      - HTTP_HOST=core
-      - DB_FILE=dev.db
+      - DB_FILE=.local/dev.db
      - DEBUG_MODE_ACTIVE=true
    volumes:
-      - ../../core:/code
-      - ../testdata.py:/code/testdata.py
+      - ../../core:/code:ro
+      - ../testdata.py:/code/testdata.py:ro
+      - backend_context:/code/.local
    ports:
      - "8000:8000"

@@ -20,10 +22,12 @@
      dockerfile: ../deploy/dev/Dockerfile.frontend
    command: npm run serve
    volumes:
-      - ../../web:/web:ro
-      - /web/node_modules
+      - ../../web/src:/web/src
      - ./vue.config.js:/web/vue.config.js
    ports:
      - "8080:8080"
    depends_on:
      - core
+
+volumes:
+  backend_context:
\ No newline at end of file
diff --git a/deploy/testing/Dockerfile.backend b/deploy/testing/Dockerfile.backend
index c968994..06e494f 100644
--- a/deploy/testing/Dockerfile.backend
+++ b/deploy/testing/Dockerfile.backend
@@ -1,11 +1,11 @@
-FROM python:3.11-bookworm
+FROM python:3.11-slim-bookworm
LABEL authors="lagertonne"
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
-COPY requirements.prod.txt /code/
-RUN apt update && apt install -y mariadb-client
-RUN pip install -r requirements.prod.txt
+RUN apt update && apt install -y pkg-config mariadb-client default-libmysqlclient-dev build-essential
RUN pip install mysqlclient
+COPY requirements.prod.txt /code/
+RUN pip install -r requirements.prod.txt
COPY .. /code/
\ No newline at end of file
diff --git a/deploy/testing/Dockerfile.frontend b/deploy/testing/Dockerfile.frontend
index 0a41d1a..a8fd652 100644
--- a/deploy/testing/Dockerfile.frontend
+++ b/deploy/testing/Dockerfile.frontend
@@ -1,4 +1,4 @@
-FROM docker.io/node:22
+FROM node:22-alpine
RUN mkdir /web
WORKDIR /web
diff --git a/deploy/testing/docker-compose.yml b/deploy/testing/docker-compose.yml
index e93e901..4a82289 100644
--- a/deploy/testing/docker-compose.yml
+++ b/deploy/testing/docker-compose.yml
@@ -1,3 +1,4 @@
+name: c3lf-sys3-testing
services:
  redis:
    image: redis
@@ -20,7 +21,7 @@
    build:
      context: ../../core
      dockerfile: ../deploy/testing/Dockerfile.backend
-    command: bash -c 'python manage.py migrate && python /code/server.py'
+    command: bash -c 'python manage.py migrate && python testdata.py && python /code/server.py'
    environment:
      - HTTP_HOST=core
      - REDIS_HOST=redis
@@ -29,13 +30,17 @@
      - DB_NAME=system3
      - DB_USER=system3
      - DB_PASSWORD=system3
+      - MAIL_DOMAIN=mail:1025
    volumes:
-      - ../../core:/code
+      - ../../core:/code:ro
+      - ../testdata.py:/code/testdata.py:ro
+      - backend_context:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
      - redis
+      - mail

  frontend:
    build:
@@ -44,12 +49,28 @@
    command: npm run serve
    volumes:
      - ../../web:/web:ro
-      - /web/node_modules
-      - ./vue.config.js:/web/vue.config.js
+      - ./vue.config.js:/web/vue.config.js:ro
+      - frontend_context:/web
    ports:
      - "8080:8080"
    depends_on:
      - core
+
+  mail:
+    image: docker.io/axllent/mailpit
+    volumes:
+      - mailpit_data:/data
+    ports:
+      - 8025:8025
+      - 1025:1025
+    environment:
+      MP_MAX_MESSAGES: 5000
+      MP_DATABASE: /data/mailpit.db
+      MP_SMTP_AUTH_ACCEPT_ANY: 1
+      MP_SMTP_AUTH_ALLOW_INSECURE: 1
+
volumes:
-  mariadb_data:
\ No newline at end of file
+  mariadb_data:
+  mailpit_data:
+  frontend_context:
+  backend_context:
diff --git a/web/node_modules/.forgit_fordocker b/web/node_modules/.forgit_fordocker
new file mode 100644
index 0000000..e69de29
diff --git a/web/src/components/AddItemModal.vue b/web/src/components/AddItemModal.vue
index a3c23fd..24bd449 100644
--- a/web/src/components/AddItemModal.vue
+++ b/web/src/components/AddItemModal.vue
@@ -2,7 +2,29 @@
diff --git a/web/src/components/AddTicketModal.vue b/web/src/components/AddTicketModal.vue
index 37d539c..b407670 100644
--- a/web/src/components/AddTicketModal.vue
+++ b/web/src/components/AddTicketModal.vue
@@ -19,11 +19,10 @@
diff --git a/web/src/components/Navbar.vue b/web/src/components/Navbar.vue
index ccfb0f0..7f5e257 100644
--- a/web/src/components/Navbar.vue
+++ b/web/src/components/Navbar.vue
@@ -115,10 +115,10 @@ export default {
      this.$router.push(link);
    },
    isItemView() {
-      return this.getActiveView === 'items' || this.getActiveView === 'item';
+      return this.getActiveView === 'items' || this.getActiveView === 'item' || this.getActiveView === 'item_search';
    },
    isTicketView() {
-      return this.getActiveView === 'tickets' || this.getActiveView === 'ticket';
+      return this.getActiveView === 'tickets' || this.getActiveView === 'ticket' || this.getActiveView === 'ticket_search';
    },
    setLayout(layout) {
      if (this.route.query.layout === layout)
diff --git a/web/src/components/Timeline.vue b/web/src/components/Timeline.vue
index 0099c03..ee0ca4d 100644
--- a/web/src/components/Timeline.vue
+++ b/web/src/components/Timeline.vue
@@ -24,6 +24,15 @@ + + + + + + + + +
@@ -35,6 +44,9 @@ + + +
@@ -58,10 +70,16 @@
import TimelineShippingVoucher from "@/components/TimelineShippingVoucher.vue";
import AsyncButton from "@/components/inputs/AsyncButton.vue";
import TimelinePlacement from "@/components/TimelinePlacement.vue";
import TimelineRelatedTicket from "@/components/TimelineRelatedTicket.vue";
+import TimelineCreated from "@/components/TimelineCreated.vue";
+import TimelineReturned from "@/components/TimelineReturned.vue";
+import TimelineDeleted from "@/components/TimelineDeleted.vue";
export default {
  name: 'Timeline',
  components: {
+    TimelineDeleted,
+    TimelineReturned,
+    TimelineCreated,
    TimelineRelatedTicket,
    TimelinePlacement,
    TimelineShippingVoucher,
diff --git a/web/src/components/TimelineCreated.vue b/web/src/components/TimelineCreated.vue
new file mode 100644
index 0000000..126272c
--- /dev/null
+++ b/web/src/components/TimelineCreated.vue
@@ -0,0 +1,83 @@ + + + +
\ No newline at end of file
diff --git a/web/src/components/TimelineDeleted.vue b/web/src/components/TimelineDeleted.vue
new file mode 100644
index 0000000..076467d
--- /dev/null
+++ b/web/src/components/TimelineDeleted.vue
@@ -0,0 +1,83 @@ + + + +
\ No newline at end of file
diff --git a/web/src/components/TimelineReturned.vue b/web/src/components/TimelineReturned.vue
new file mode 100644
index 0000000..6eb740b
--- /dev/null
+++ b/web/src/components/TimelineReturned.vue
@@ -0,0 +1,83 @@ + + + +
\ No newline at end of file
diff --git a/web/src/components/inputs/InputCombo.vue b/web/src/components/inputs/InputCombo.vue
index fc64d42..2a291e0 100644
--- a/web/src/components/inputs/InputCombo.vue
+++ b/web/src/components/inputs/InputCombo.vue
@@ -43,11 +43,11 @@ export default {
  props: ['label', 'model', 'nameKey', 'uniqueKey', 'options', 'onOptionAdd'],
  data: ({options, model, nameKey, uniqueKey}) => ({
    internalName: model[nameKey],
-    selectedOption: options.filter(e => e[uniqueKey] == model[uniqueKey])[0],
+    selectedOption: options.filter(e => e[uniqueKey] === model[uniqueKey])[0],
    addingOption: false
  }),
  computed: {
-    isValid: ({options, nameKey, internalName}) => options.some(e => e[nameKey] == internalName),
+    isValid: ({options, nameKey, internalName}) => options.some(e => e[nameKey] === internalName),
    sortedOptions: ({
      options,
      nameKey
@@ -56,7 +56,7 @@
  watch: {
    internalName(newValue) {
      if (this.isValid) {
-        if (!this.selectedOption || newValue != this.selectedOption[this.nameKey]) {
+        if (!this.selectedOption || newValue !== this.selectedOption[this.nameKey]) {
          this.selectedOption = this.options.filter(e => e[this.nameKey] === newValue)[0];
        }
        this.model[this.nameKey] = this.selectedOption[this.nameKey];
diff --git a/web/src/components/inputs/SearchBox.vue b/web/src/components/inputs/SearchBox.vue
index eb32b07..79fb798 100644
--- a/web/src/components/inputs/SearchBox.vue
+++ b/web/src/components/inputs/SearchBox.vue
@@ -12,6 +12,7 @@ + +
\ No newline at end of file
diff --git a/web/src/views/Items.vue b/web/src/views/Items.vue
index 21b11d1..72b4079 100644
--- a/web/src/views/Items.vue
+++ b/web/src/views/Items.vue
@@ -67,11 +67,10 @@ + +
\ No newline at end of file
diff --git a/web/src/views/Tickets.vue b/web/src/views/Tickets.vue
index 069ce0e..60d0d1d 100644
--- a/web/src/views/Tickets.vue
+++ b/web/src/views/Tickets.vue
@@ -26,7 +26,7 @@
        :columns="['id', 'name', 'last_activity', 'assigned_to', ...(getEventSlug==='all'?['event']:[])]"
        :keyName="'state'"
        :sections="['pending_new', 'pending_open','pending_shipping',
-            'pending_physical_confirmation','pending_return','pending_postponed'].map(stateInfo)">
+            'pending_physical_confirmation','pending_return','pending_postponed','pending_suspected_spam'].map(stateInfo)">
diff --git a/web/src/views/admin/Events.vue b/web/src/views/admin/Events.vue
index e2ff952..e156d56 100644
--- a/web/src/views/admin/Events.vue
+++ b/web/src/views/admin/Events.vue
@@ -1,6 +1,7 @@