Compare commits
21 commits
jedi/searc
...
testing

SHA1:
- 9def22a836
- 8d45fef627
- 756fe4aaad
- 51ddc8edc3
- e8a92b26fa
- d80fb60afd
- 6b0def543c
- 1568252112
- 9e0540d133
- c2bcd53749
- 86b4220eaa
- 13994a111e
- 554bc70413
- 9395226c5f
- c26152d3c5
- 70516db074
- 2677f4b8b6
- fbbf8352cf
- 4ea74637a3
- f133ae9e60
- 0fa52645c2
34 changed files with 10341 additions and 171 deletions

158
README.md
Normal file

@@ -0,0 +1,158 @@
# C3LF System3

the third try to automate lost&found organization for chaos events. not a complete rewrite, but instead building on top
of the web frontend of version 2. everything else is new but still API compatible. now with more monorepo.

## Architecture

C3LF System3 integrates a Django-Rest-Framework + WebSocket backend, a Vue.js frontend SPA and a minimal LMTP mail server
integrated with the Django backend. It is additionally deployed with a Postfix mail server as a proxy in front of the
LMTP socket, a MariaDB database, a Redis cache and an Nginx reverse proxy that serves the static SPA frontend, proxies
the API requests to the backend and serves the media files in cooperation with the Django backend using the
`X-Accel-Redirect` header.
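The `X-Accel-Redirect` handoff can be sketched as follows: the backend authorizes the request, then returns an empty response whose header points Nginx at an internal location, so the file bytes never flow through Django. The `/protected/` prefix and the helper name below are illustrative assumptions, not the project's actual configuration.

```python
def x_accel_response_headers(media_path: str, internal_prefix: str = "/protected/") -> dict:
    """Build response headers for an X-Accel-Redirect handoff.

    The backend sends an otherwise-empty response carrying this header;
    nginx intercepts it and serves the file from the matching `internal`
    location block itself.
    """
    return {
        "X-Accel-Redirect": internal_prefix + media_path.lstrip("/"),
        # An empty Content-Type lets nginx derive the MIME type
        # from the file extension instead of Django guessing it.
        "Content-Type": "",
    }
```

The corresponding Nginx location would be marked `internal` so clients cannot request the prefixed path directly.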

The production deployment is automated using Ansible and there are some Docker Compose configurations for development.

## Project Structure

- `core/` Contains the Django backend with database models, API endpoints, migrations, API tests, and mail server
  functionalities.
- `web/` Contains the Vue.js frontend application.
- `deploy/` Contains deployment configurations and Docker scripts for various development modes.

For more information, see the README.md files in the respective directories.

## Development Modes

There are currently 4 development modes for this project:

- Frontend-Only
- Backend-API-Only
- Full-Stack-Lite 'dev' (docker)
- **[WIP]** Full-Stack 'testing' (docker)

*Choose the one that is most suited to the feature you want to work on or is easiest for you to set up ;)*

For all modes it is assumed that you have `git` installed, have cloned the repository and are in the root directory of
the project. Use `git clone https://git.hannover.ccc.de/c3lf/c3lf-system-3.git` to get the official upstream repository.
The required packages for each mode are listed separately and also state the specific package name for Debian 12.

### Frontend-Only

This mode is for developing the frontend only. It uses the vue-cli-service (webpack) to serve the frontend and watches
for changes in the source code to provide hot reloading. The API requests are proxied to the staging backend.

#### Requirements

* Node.js (~20.19.0) (`nodejs`)
* npm (~9.2.0) (`npm`)

*Note: The versions are not strict, but these are tested. Other versions might work as well.*

#### Steps

```bash
cd web
npm install
npm run serve
```

Now you can access the frontend at `localhost:8080` and start editing the code in the `web` directory.
For more information, see the README.md file in the `web` directory.

### Backend-API-Only

This mode is for developing the backend API only. It specifically excludes most WebSocket and mail server
functionalities. Use this mode to focus on the backend API and database models.

#### Requirements

* Python (~3.11) (`python3`)
* pip (`python3-pip`)
* virtualenv (`python3-venv`)

*Note: The versions are not strict, but these are tested. Other versions might work as well.*

#### Steps

```bash
python -m venv venv
source venv/bin/activate
pip install -r core/requirements.dev.txt
cd core
python manage.py test
```

The tests should run successfully to start with, and you can then begin the TDD workflow by adding new failing tests.
For more information about the backend and TDD, see the README.md file in the `core` directory.

### Full-Stack-Lite 'dev' (docker)

This mode is for developing both the frontend and the backend at the same time in a containerized environment. It
uses the `docker-compose` command to build and run the application in a container. It specifically excludes all mail
server and most WebSocket functionalities.

#### Requirements

* Docker (`docker.io`)
* Docker Compose (`docker-compose`)

*Note: Depending on your system, the `docker compose` command might be included in the general `docker` or `docker-ce`
package, or you might want to use podman instead.*

#### Steps

```bash
docker-compose -f deploy/dev/docker-compose.yml up --build
```

The page should be available at [localhost:8080](http://localhost:8080).
This mode provides a minimal set of test data, including a user `testuser` with password `testuser`. The test dataset is
defined in `deploy/testdata.py` and can be extended there.

You can now edit code in `/web` and `/core` and changes will be applied to the running page as soon as the file is
saved.

For details about each part, read `/web/README.md` and `/core/README.md` respectively. To execute commands in the
container context, use `exec` like

```bash
docker exec -it c3lf-sys3-dev-core-1 ./manage.py test
```

### Full-Stack 'testing' (docker)

**WORK IN PROGRESS**

*will include postfix, mariadb, redis, nginx and the ability to test sending mail, receiving mail and websocket-based
realtime updates in the frontend. the last step in verification before deploying to the staging system using ansible*

## Online Instances

These are deployed using `deploy/ansible/playbooks/deploy-c3lf-sys3.yml` and follow a specific git branch.

### 'live'

| URL            | [c3lf.de](https://c3lf.de) |
|----------------|----------------------------|
| **Branch**     | live                       |
| **Host**       | polaris.lab.or.it          |
| **Debug Mode** | off                        |

This is the **'production' system** and should strictly follow the staging system after all changes have been validated.

### 'staging'

| URL            | [staging.c3lf.de](https://staging.c3lf.de) |
|----------------|--------------------------------------------|
| **Branch**     | testing                                    |
| **Host**       | andromeda.lab.or.it                        |
| **Debug Mode** | on                                         |

This system is automatically updated by [git.hannover.ccc.de](https://git.hannover.ccc.de/c3lf/c3lf-system-3/) whenever
a commit is pushed to the 'testing' branch and the backend tests have passed.

**WARNING: although this is the staging system, it is fully functional and contains a copy of the 'production' data, so
do not, for example, reply to tickets for testing purposes, as the system WILL SEND AN EMAIL to the person who originally
created it. If you want to test something like that, first create your own test ticket by sending an email to
`<event>@staging.c3lf.de`**
0
core/.local/.forgit_fordocker
Normal file

68
core/README.md
Normal file

@@ -0,0 +1,68 @@

# Core

This directory contains the backend of the C3LF System3 project, which is built using Django and Django Rest Framework.

## Modules

- `authentication`: Handles user authentication and authorization.
- `files`: Manages file uploads and related operations.
- `inventory`: Handles inventory management, including events, containers and items.
- `mail`: Manages email-related functionalities, including sending and receiving emails.
- `notify_sessions`: Handles real-time notifications and WebSocket sessions.
- `tickets`: Manages the ticketing system for issue tracking.

## Module Structure

Most modules follow a similar structure, including the following components:

- `<module>/models.py`: Contains the database models for the module.
- `<module>/serializers.py`: Contains the serializers for the module models.
- `<module>/api_<api_version>.py`: Contains the API views and endpoints for the module.
- `<module>/migrations/`: Contains database migration files. Needs to contain an `__init__.py` file to be recognized as
  a Python package and for automatic migration creation to work.
- `<module>/tests/<api_version>/test_<feature_model_or_testcase>.py`: Contains the test cases for the module.

## Development Setup

Follow the instructions under 'Backend-API-Only' or 'Full-Stack-Lite' in the root level `README.md` to set up a
development environment.

## Test-Driven Development (TDD) Workflow

The project follows a TDD workflow to ensure code quality and reliability. Here is a step-by-step guide to the TDD
process:

1. **Write a Test**: Start by writing a test case for the new feature or bug fix. Place the test case in the appropriate
   module within the `<module>/tests/<api_version>/test_<feature_model_or_testcase>.py` file.

2. **Run the Test**: Execute the test to ensure it fails, confirming that the feature is not yet implemented or the bug
   exists.
   ```bash
   python manage.py test
   ```

3. **Write the Code**: Implement the code required to pass the test. Write the code in the appropriate module within the
   project.

4. **Run the Test Again**: Execute the test again to ensure it passes.
   ```bash
   python manage.py test
   ```

5. **Refactor**: Refactor the code to improve its structure and readability while ensuring that all tests still pass.

6. **Repeat**: Repeat the process for each new feature or bug fix.
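As a self-contained illustration of this red-green loop, using plain `unittest` and a stand-in helper rather than the project's real modules: in step 1 the test below would be written before the helper exists, in step 2 it would fail, and the implementation shown is what step 3 adds to make it pass. The name mirrors the codebase's `short_uuid()` helper, but its behaviour here is an assumption for the example.

```python
import unittest


def short_uuid(full_uuid: str) -> str:
    # Stand-in implementation written in step 3: the first 8 hex
    # characters of a ticket UUID, with the dashes removed.
    return full_uuid.replace("-", "")[:8]


class ShortUuidTest(unittest.TestCase):
    # Step 1: the test encodes the expected behaviour up front.
    def test_first_eight_hex_chars(self):
        self.assertEqual(short_uuid("123e4567-e89b-12d3-a456-426614174000"),
                         "123e4567")
```

In the real project the test would live under `<module>/tests/<api_version>/` and be run with `python manage.py test`.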

## Measuring Test Coverage

The project uses the `coverage` package to measure test coverage. To generate a coverage report, run the following
commands:

```bash
coverage run --source='.' manage.py test
coverage report
```

## Additional Information

For more detailed information on the project structure and development modes, refer to the root level `README.md`.

@@ -3,18 +3,20 @@ from prometheus_client.core import CounterMetricFamily, REGISTRY
from django.db.models import Case, Value, When, BooleanField, Count
from inventory.models import Item


class ItemCountCollector(object):

def collect(self):
counter = CounterMetricFamily("item_count", "Current number of items", labels=['event', 'returned_state'])
try:
counter = CounterMetricFamily("item_count", "Current number of items", labels=['event', 'returned_state'])

yield counter
yield counter

if not apps.models_ready or not apps.apps_ready:
return
if not apps.models_ready or not apps.apps_ready:
return

queryset = (
Item.all_objects
queryset = (
Item.all_objects
.annotate(
returned=Case(
When(returned_at__isnull=True, then=Value(False)),

@@ -25,11 +27,14 @@ class ItemCountCollector(object):
.values('event__slug', 'returned', 'event_id')
.annotate(amount=Count('id'))
.order_by('event__slug', 'returned')  # Optional: order by slug and returned
)
)

for e in queryset:
counter.add_metric([e["event__slug"].lower(), str(e["returned"])], e["amount"])
for e in queryset:
counter.add_metric([e["event__slug"].lower(), str(e["returned"])], e["amount"])

yield counter
except:
pass

yield counter

REGISTRY.register(ItemCountCollector())

@@ -1,6 +1,5 @@
from django.core.files.base import ContentFile
from django.db import models, IntegrityError
from django_softdelete.models import SoftDeleteModel

from inventory.models import Item

@@ -10,7 +9,8 @@ def hash_upload(instance, filename):


class FileManager(models.Manager):
def get_or_create(self, **kwargs):

def __file_data_helper(self, **kwargs):
if 'data' in kwargs and type(kwargs['data']) == str:
import base64
from hashlib import sha256

@@ -31,6 +31,10 @@ class FileManager(models.Manager):
pass
else:
raise ValueError('data must be a base64 encoded string or file and hash must be provided')
return kwargs

def get_or_create(self, **kwargs):
kwargs = self.__file_data_helper(**kwargs)
try:
return self.get(hash=kwargs['hash']), False
except self.model.DoesNotExist:

@@ -39,26 +43,7 @@ class FileManager(models.Manager):
return obj, True

def create(self, **kwargs):
if 'data' in kwargs and type(kwargs['data']) == str:
import base64
from hashlib import sha256
raw = kwargs['data']
if not raw.startswith('data:'):
raise ValueError('data must be a base64 encoded string or file and hash must be provided')
raw = raw.split(';base64,')
if len(raw) != 2:
raise ValueError('data must be a base64 encoded string or file and hash must be provided')
mime_type = raw[0].split(':')[1]
content = base64.b64decode(raw[1], validate=True)
kwargs.pop('data')
content_hash = sha256(content).hexdigest()
kwargs['file'] = ContentFile(content, content_hash)
kwargs['hash'] = content_hash
kwargs['mime_type'] = mime_type
elif 'file' in kwargs and 'hash' in kwargs and type(kwargs['file']) == ContentFile and 'mime_type' in kwargs:
pass
else:
raise ValueError('data must be a base64 encoded string or file and hash must be provided')
kwargs = self.__file_data_helper(**kwargs)
if not self.filter(hash=kwargs['hash']).exists():
obj = super().create(**kwargs)
obj.file.save(content=kwargs['file'], name=kwargs['hash'])
@@ -39,13 +39,61 @@ class ItemViewSet(viewsets.ModelViewSet):

def filter_items(items, query):
query_tokens = query.split(' ')
matches = []
for item in items:
value = 0
if "I#" + str(item.id) in query:
value += 12
matches.append(
{'type': 'item_id', 'text': f'is exactly {item.id} and matched "I#{item.id}"'})
elif "#" + str(item.id) in query:
value += 11
matches.append(
{'type': 'item_id', 'text': f'is exactly {item.id} and matched "#{item.id}"'})
elif str(item.id) in query:
value += 10
matches.append({'type': 'item_id', 'text': f'is exactly {item.id}'})
for issue in item.related_issues:
if "T#" + issue.short_uuid() in query:
value += 8
matches.append({'type': 'ticket_uuid',
'text': f'is exactly {issue.short_uuid()} and matched "T#{issue.short_uuid()}"'})
elif "#" + issue.short_uuid() in query:
value += 5
matches.append({'type': 'ticket_uuid',
'text': f'is exactly {issue.short_uuid()} and matched "#{issue.short_uuid()}"'})
elif issue.short_uuid() in query:
value += 3
matches.append({'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()}'})
if "T#" + str(issue.id) in query:
value += 8
matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "T#{issue.id}"'})
elif "#" + str(issue.id) in query:
value += 5
matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "#{issue.id}"'})
elif str(issue.id) in query:
value += 3
matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id}'})
for comment in issue.comments.all():
for token in query_tokens:
if token in comment.comment:
value += 1
matches.append({'type': 'ticket_comment', 'text': f'contains {token}'})
for token in query_tokens:
if token in issue.name:
value += 1
matches.append({'type': 'ticket_name', 'text': f'contains {token}'})
for token in query_tokens:
if token in item.description:
value += 1
matches.append({'type': 'item_description', 'text': f'contains {token}'})
for comment in item.comments.all():
for token in query_tokens:
if token in comment.comment:
value += 1
matches.append({'type': 'comment', 'text': f'contains {token}'})
if value > 0:
yield {'search_score': value, 'item': item}
yield {'search_score': value, 'item': item, 'search_matches': matches}


@api_view(['GET'])
@@ -1,6 +1,9 @@
from itertools import groupby

from django.db import models
from django.db.models.signals import pre_save
from django.dispatch import receiver
from django.utils import timezone
from django_softdelete.models import SoftDeleteModel, SoftDeleteManager

@@ -64,6 +67,11 @@ class Item(SoftDeleteModel):
return '[' + str(self.id) + ']' + self.description


@receiver(pre_save, sender=Item)
def item_updated(sender, instance, **kwargs):
instance.updated_at = timezone.now()


class Container(SoftDeleteModel):
id = models.AutoField(primary_key=True)
name = models.CharField(max_length=255)
@@ -132,15 +132,33 @@ class ItemSerializer(BasicItemSerializer):
'cid': placement.container.id,
'box': placement.container.name
})

if obj.created_at:
timeline.append({
'type': 'created',
'timestamp': obj.created_at,
})
if obj.returned_at:
timeline.append({
'type': 'returned',
'timestamp': obj.returned_at,
})
if obj.deleted_at:
timeline.append({
'type': 'deleted',
'timestamp': obj.deleted_at,
})
return sorted(timeline, key=lambda x: x['timestamp'])


class SearchResultSerializer(serializers.Serializer):
search_score = serializers.IntegerField()
search_matches = serializers.ListField(child=serializers.DictField())
item = ItemSerializer()

def to_representation(self, instance):
return {**ItemSerializer(instance['item']).data, 'search_score': instance['search_score']}
return {**ItemSerializer(instance['item']).data, 'search_score': instance['search_score'],
'search_matches': instance['search_matches']}

class Meta:
model = Item
@@ -63,28 +63,28 @@ class ItemTestCase(TestCase):
self.assertEqual(response.json()[0]['file'], None)
self.assertEqual(response.json()[0]['returned'], False)
self.assertEqual(response.json()[0]['event'], self.event.slug)
self.assertEqual(len(response.json()[0]['timeline']), 4)
self.assertEqual(response.json()[0]['timeline'][0]['type'], 'placement')
self.assertEqual(response.json()[0]['timeline'][1]['type'], 'comment')
self.assertEqual(response.json()[0]['timeline'][2]['type'], 'issue_relation')
self.assertEqual(response.json()[0]['timeline'][3]['type'], 'placement')
self.assertEqual(response.json()[0]['timeline'][1]['id'], comment.id)
self.assertEqual(response.json()[0]['timeline'][2]['id'], match.id)
self.assertEqual(response.json()[0]['timeline'][3]['id'], placement.id)
self.assertEqual(response.json()[0]['timeline'][0]['box'], 'BOX1')
self.assertEqual(response.json()[0]['timeline'][0]['cid'], self.box1.id)
self.assertEqual(response.json()[0]['timeline'][1]['comment'], 'test')
self.assertEqual(response.json()[0]['timeline'][1]['timestamp'],
comment.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
self.assertEqual(response.json()[0]['timeline'][2]['status'], 'possible')
self.assertEqual(response.json()[0]['timeline'][2]['timestamp'],
match.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
self.assertEqual(response.json()[0]['timeline'][2]['issue_thread']['name'], "test issue")
self.assertEqual(response.json()[0]['timeline'][2]['issue_thread']['event'], "EVENT")
self.assertEqual(response.json()[0]['timeline'][2]['issue_thread']['state'], "pending_new")
self.assertEqual(response.json()[0]['timeline'][3]['box'], 'BOX2')
self.assertEqual(response.json()[0]['timeline'][3]['cid'], self.box2.id)
self.assertEqual(response.json()[0]['timeline'][3]['timestamp'],
self.assertEqual(len(response.json()[0]['timeline']), 5)
self.assertEqual(response.json()[0]['timeline'][0]['type'], 'created')
self.assertEqual(response.json()[0]['timeline'][1]['type'], 'placement')
self.assertEqual(response.json()[0]['timeline'][2]['type'], 'comment')
self.assertEqual(response.json()[0]['timeline'][3]['type'], 'issue_relation')
self.assertEqual(response.json()[0]['timeline'][4]['type'], 'placement')
self.assertEqual(response.json()[0]['timeline'][2]['id'], comment.id)
self.assertEqual(response.json()[0]['timeline'][3]['id'], match.id)
self.assertEqual(response.json()[0]['timeline'][4]['id'], placement.id)
self.assertEqual(response.json()[0]['timeline'][1]['box'], 'BOX1')
self.assertEqual(response.json()[0]['timeline'][1]['cid'], self.box1.id)
self.assertEqual(response.json()[0]['timeline'][0]['timestamp'], item.created_at.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
self.assertEqual(response.json()[0]['timeline'][2]['comment'], 'test')
self.assertEqual(response.json()[0]['timeline'][2]['timestamp'], comment.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
self.assertEqual(response.json()[0]['timeline'][3]['status'], 'possible')
self.assertEqual(response.json()[0]['timeline'][3]['timestamp'], match.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
self.assertEqual(response.json()[0]['timeline'][3]['issue_thread']['name'], "test issue")
self.assertEqual(response.json()[0]['timeline'][3]['issue_thread']['event'], "EVENT")
self.assertEqual(response.json()[0]['timeline'][3]['issue_thread']['state'], "pending_new")
self.assertEqual(response.json()[0]['timeline'][4]['box'], 'BOX2')
self.assertEqual(response.json()[0]['timeline'][4]['cid'], self.box2.id)
self.assertEqual(response.json()[0]['timeline'][4]['timestamp'],
placement.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
self.assertEqual(len(response.json()[0]['related_issues']), 1)
self.assertEqual(response.json()[0]['related_issues'][0]['name'], "test issue")
@@ -53,6 +53,12 @@ def unescape_simplified_quoted_printable(s, encoding='utf-8'):
return quopri.decodestring(s).decode(encoding)


def decode_inline_encodings(s):
s = unescape_and_decode_quoted_printable(s)
s = unescape_and_decode_base64(s)
return s


def ascii_strip(s):
if not s:
return None

@@ -87,17 +93,17 @@ def make_reply(reply_email, references=None, event=None):
reply_email.save()
if references:
reply["References"] = " ".join(references)

reply.set_content(reply_email.body)

return reply

if reply_email.body != "":
reply.set_content(reply_email.body)
return reply
else:
raise SpecialMailException("mail content emty")

async def send_smtp(message):
await aiosmtplib.send(message, hostname="127.0.0.1", port=25, use_tls=False, start_tls=False)


def find_active_issue_thread(in_reply_to, address, subject, event):
def find_active_issue_thread(in_reply_to, address, subject, event, spam=False):
from re import match
uuid_match = match(r'^ticket\+([a-f0-9-]{36})@', address)
if uuid_match:

@@ -108,7 +114,8 @@ def find_active_issue_thread(in_reply_to, address, subject, event):
if reply_to.exists():
return reply_to.first().issue_thread, False
else:
issue = IssueThread.objects.create(name=subject, event=event)
issue = IssueThread.objects.create(name=subject, event=event,
initial_state='pending_suspected_spam' if spam else 'pending_new')
return issue, True


@@ -128,10 +135,13 @@ def decode_email_segment(segment, charset, transfer_encoding):
decode_as = 'cp1251'
elif charset == 'iso-8859-1':
decode_as = 'latin1'
segment = unescape_and_decode_quoted_printable(segment)
segment = unescape_and_decode_base64(segment)
if transfer_encoding == 'quoted-printable':
segment = unescape_simplified_quoted_printable(segment, decode_as)
elif transfer_encoding == 'base64':
import base64
segment = base64.b64decode(segment).decode('utf-8')
else:
segment = decode_inline_encodings(segment.decode('utf-8'))
return segment


@@ -156,7 +166,7 @@ def parse_email_body(raw, log=None):
segment = part.get_payload()
if not segment:
continue
segment = decode_email_segment(segment, charset, part.get('Content-Transfer-Encoding'))
segment = decode_email_segment(segment.encode('utf-8'), charset, part.get('Content-Transfer-Encoding'))
log.debug(segment)
body = body + segment
elif 'attachment' in cdispo or 'inline' in cdispo:

@@ -189,7 +199,8 @@ def parse_email_body(raw, log=None):
else:
log.warning("Unknown content type %s", parsed.get_content_type())
body = "Unknown content type"
body = decode_email_segment(body, parsed.get_content_charset(), parsed.get('Content-Transfer-Encoding'))
body = decode_email_segment(body.encode('utf-8'), parsed.get_content_charset(),
parsed.get('Content-Transfer-Encoding'))
log.debug(body)

return parsed, body, attachments
@@ -203,6 +214,8 @@ def receive_email(envelope, log=None):
header_to = parsed.get('To')
header_in_reply_to = ascii_strip(parsed.get('In-Reply-To'))
header_message_id = ascii_strip(parsed.get('Message-ID'))
maybe_spam = parsed.get('X-Spam')
suspected_spam = (maybe_spam and maybe_spam.lower() == 'yes')

if match(r'^([a-zA-Z ]*<)?MAILER-DAEMON@', header_from) and envelope.mail_from.strip("<>") == "":
log.warning("Ignoring mailer daemon")

@@ -210,18 +223,20 @@

if Email.objects.filter(reference=header_message_id).exists():  # break before issue thread is created
log.warning("Email already exists")
raise Exception("Email already exists")
raise SpecialMailException("Email already exists")

recipient = envelope.rcpt_tos[0].lower() if envelope.rcpt_tos else header_to.lower()
sender = envelope.mail_from if envelope.mail_from else header_from
subject = ascii_strip(parsed.get('Subject'))
if not subject:
subject = "No subject"
subject = unescape_and_decode_quoted_printable(subject)
subject = unescape_and_decode_base64(subject)
subject = decode_inline_encodings(subject)
recipient = decode_inline_encodings(recipient)
sender = decode_inline_encodings(sender)
target_event = find_target_event(recipient)

active_issue_thread, new = find_active_issue_thread(header_in_reply_to, recipient, subject, target_event)
active_issue_thread, new = find_active_issue_thread(
header_in_reply_to, recipient, subject, target_event, suspected_spam)

from hashlib import sha256
random_filename = 'mail-' + sha256(envelope.content).hexdigest()

@@ -239,7 +254,7 @@
if new:
# auto reply if new issue
references = collect_references(active_issue_thread)
if not sender.startswith('noreply'):
if not sender.startswith('noreply') and not sender.startswith('no-reply') and not suspected_spam:
subject = f"Re: {subject} [#{active_issue_thread.short_uuid()}]"
body = '''Your request (#{}) has been received and will be reviewed by our lost&found angels.

@@ -252,7 +267,7 @@ do not create a new request.

Your c3lf (Cloakroom + Lost&Found) Team'''.format(active_issue_thread.short_uuid())
reply_email = Email.objects.create(
sender=recipient, recipient=sender, body=body, subject=ascii_strip(subject),
sender=recipient, recipient=sender, body=body, subject=subject,
in_reply_to=header_message_id, event=target_event, issue_thread=active_issue_thread)
reply = make_reply(reply_email, references, event=target_event.slug if target_event else None)
else:

@@ -288,10 +303,10 @@ class LMTPHandler:
systemevent = await database_sync_to_async(SystemEvent.objects.create)(type='email received',
reference=email.id)
log.info(f"Created system event {systemevent.id}")
channel_layer = get_channel_layer()
await channel_layer.group_send(
'general', {"type": "generic.event", "name": "send_message_to_frontend", "event_id": systemevent.id,
"message": "email received"})
#channel_layer = get_channel_layer()
#await channel_layer.group_send(
#    'general', {"type": "generic.event", "name": "send_message_to_frontend", "event_id": systemevent.id,
#                "message": "email received"})
log.info(f"Sent message to frontend")
if new and reply:
log.info('Sending message to %s' % reply['To'])
@ -165,7 +165,7 @@ class LMTPHandlerTestCase(TestCase): # TODO replace with less hacky test
|
|||
self.assertEqual('Text mit Quoted-Printable-Kodierung: äöüß', Email.objects.all()[0].body)
|
||||
self.assertTrue(Email.objects.all()[0].raw_file.path)
|
||||
|
||||
def test_handle_base64(self):
|
||||
def test_handle_base64_inline(self):
|
||||
from aiosmtpd.smtp import Envelope
|
||||
from asgiref.sync import async_to_sync
|
||||
import aiosmtplib
|
||||
|
```diff
@@ -186,6 +186,35 @@ class LMTPHandlerTestCase(TestCase):  # TODO replace with less hacky test
         self.assertEqual('Text mit Base64-Kodierung: äöüß', Email.objects.all()[0].body)
         self.assertTrue(Email.objects.all()[0].raw_file.path)
 
+    def test_handle_base64_transfer_encoding(self):
+        from aiosmtpd.smtp import Envelope
+        from asgiref.sync import async_to_sync
+        import aiosmtplib
+        aiosmtplib.send = make_mocked_coro()
+        handler = LMTPHandler()
+        server = mock.Mock()
+        session = mock.Mock()
+        envelope = Envelope()
+        envelope.mail_from = 'test1@test'
+        envelope.rcpt_tos = ['test2@test']
+        envelope.content = b'''Subject: test
+From: test3@test
+To: test4@test
+Message-ID: <1@test>
+Content-Type: text/plain; charset=utf-8
+Content-Transfer-Encoding: base64
+
+VGVzdCBtaXQgQmFzZTY0LUtvZGllcnVuZzogw6TDtsO8w58='''
+
+        result = async_to_sync(handler.handle_DATA)(server, session, envelope)
+        self.assertEqual(result, '250 Message accepted for delivery')
+        self.assertEqual(len(Email.objects.all()), 2)
+        self.assertEqual(len(IssueThread.objects.all()), 1)
+        aiosmtplib.send.assert_called_once()
+        self.assertEqual('test', Email.objects.all()[0].subject)
+        self.assertEqual('Test mit Base64-Kodierung: äöüß', Email.objects.all()[0].body)
+        self.assertTrue(Email.objects.all()[0].raw_file.path)
+
     def test_handle_client_reply(self):
         issue_thread = IssueThread.objects.create(
             name="test",
```
```diff
@@ -783,6 +812,44 @@ dGVzdGltYWdl
         self.assertEqual(None, IssueThread.objects.all()[0].assigned_to)
         aiosmtplib.send.assert_called_once()
 
+    def test_mail_spam_header(self):
+        from aiosmtpd.smtp import Envelope
+        from asgiref.sync import async_to_sync
+        import aiosmtplib
+        aiosmtplib.send = make_mocked_coro()
+        handler = LMTPHandler()
+        server = mock.Mock()
+        session = mock.Mock()
+        envelope = Envelope()
+        envelope.mail_from = 'test1@test'
+        envelope.rcpt_tos = ['test2@test']
+        envelope.content = b'''Subject: test
+From: test1@test
+To: test2@test
+Message-ID: <1@test>
+X-Spam: Yes
+
+test'''
+        result = async_to_sync(handler.handle_DATA)(server, session, envelope)
+
+        self.assertEqual(result, '250 Message accepted for delivery')
+        self.assertEqual(len(Email.objects.all()), 1)  # do not send auto reply if spam is suspected
+        self.assertEqual(len(IssueThread.objects.all()), 1)
+        aiosmtplib.send.assert_not_called()
+        self.assertEqual('test', Email.objects.all()[0].subject)
+        self.assertEqual('test1@test', Email.objects.all()[0].sender)
+        self.assertEqual('test2@test', Email.objects.all()[0].recipient)
+        self.assertEqual('test', Email.objects.all()[0].body)
+        self.assertEqual(IssueThread.objects.all()[0], Email.objects.all()[0].issue_thread)
+        self.assertEqual('<1@test>', Email.objects.all()[0].reference)
+        self.assertEqual(None, Email.objects.all()[0].in_reply_to)
+        self.assertEqual('test', IssueThread.objects.all()[0].name)
+        self.assertEqual('pending_suspected_spam', IssueThread.objects.all()[0].state)
+        self.assertEqual(None, IssueThread.objects.all()[0].assigned_to)
+        states = StateChange.objects.filter(issue_thread=IssueThread.objects.all()[0])
+        self.assertEqual(1, len(states))
+        self.assertEqual('pending_suspected_spam', states[0].state)
+
     def test_mail_4byte_unicode_emoji(self):
         from aiosmtpd.smtp import Envelope
         from asgiref.sync import async_to_sync
```
```diff
@@ -13,7 +13,7 @@ Automat==22.10.0
 beautifulsoup4==4.12.2
 bs4==0.0.1
 certifi==2023.11.17
-cffi==1.16.0
+#cffi==1.16.0
 channels==4.0.0
 channels-redis==4.1.0
 charset-normalizer==3.3.2
@@ -40,12 +40,12 @@ inflection==0.5.1
 itypes==1.2.0
 Jinja2==3.1.2
 MarkupSafe==2.1.3
-msgpack==1.0.7
-msgpack-python==0.5.6
+#msgpack==1.0.7
+#msgpack-python==0.5.6
 multidict==6.0.5
 openapi-codec==1.3.2
 packaging==23.2
-Pillow==10.1.0
+Pillow==11.1.0
 pyasn1==0.5.1
 pyasn1-modules==0.3.0
 pycares==4.4.0
@@ -69,7 +69,6 @@ typing_extensions==4.8.0
 uritemplate==4.1.1
-urllib3==2.1.0
 uvicorn==0.24.0.post1
 watchfiles==0.21.0
 websockets==12.0
 yarl==1.9.4
 zope.interface==6.1
```
```diff
@@ -102,12 +102,6 @@ def manual_ticket(request, event_slug):
         subject=request.data['name'],
         body=request.data['body'],
     )
-    systemevent = SystemEvent.objects.create(type='email received', reference=email.id)
-    channel_layer = get_channel_layer()
-    async_to_sync(channel_layer.group_send)(
-        'general', {"type": "generic.event", "name": "send_message_to_frontend", "event_id": systemevent.id,
-                    "message": "email received"}
-    )
 
     return Response(IssueSerializer(issue).data, status=status.HTTP_201_CREATED)
```
```diff
@@ -133,48 +127,75 @@ def add_comment(request, pk):
         issue_thread=issue,
         comment=request.data['comment'],
     )
     systemevent = SystemEvent.objects.create(type='comment added', reference=comment.id)
     channel_layer = get_channel_layer()
     async_to_sync(channel_layer.group_send)(
         'general', {"type": "generic.event", "name": "send_message_to_frontend", "event_id": systemevent.id,
                     "message": "comment added"}
     )
     return Response(CommentSerializer(comment).data, status=status.HTTP_201_CREATED)
 
 
 def filter_issues(issues, query):
     query_tokens = query.lower().split(' ')
+    matches = []
     for issue in issues:
         value = 0
-        if issue.short_uuid() in query:
+        if "T#" + issue.short_uuid() in query:
+            value += 12
+            matches.append(
+                {'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()} and matched "T#{issue.short_uuid()}"'})
+        elif "#" + issue.short_uuid() in query:
+            value += 11
+            matches.append(
+                {'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()} and matched "#{issue.short_uuid()}"'})
+        elif issue.short_uuid() in query:
             value += 10
+            matches.append({'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()}'})
         if "T#" + str(issue.id) in query:
             value += 10
+            matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "T#{issue.id}"'})
         elif "#" + str(issue.id) in query:
-            value += 9
+            value += 7
+            matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "#{issue.id}"'})
         elif str(issue.id) in query:
             value += 4
+            matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id}'})
         for item in issue.related_items:
             if "I#" + str(item.id) in query:
                 value += 8
+                matches.append({'type': 'item_id', 'text': f'is exactly {item.id} and matched "I#{item.id}"'})
             elif "#" + str(item.id) in query:
                 value += 5
+                matches.append({'type': 'item_id', 'text': f'is exactly {item.id} and matched "#{item.id}"'})
             elif str(item.id) in query:
                 value += 3
+                matches.append({'type': 'item_id', 'text': f'is exactly {item.id}'})
             for token in query_tokens:
                 if token in item.description.lower():
                     value += 1
+                    matches.append({'type': 'item_description', 'text': f'contains {token}'})
             for comment in item.comments.all():
                 for token in query_tokens:
                     if token in comment.comment.lower():
                         value += 1
+                        matches.append({'type': 'item_comment', 'text': f'contains {token}'})
         for token in query_tokens:
             if token in issue.name.lower():
                 value += 1
+                matches.append({'type': 'ticket_name', 'text': f'contains {token}'})
         for comment in issue.comments.all():
             for token in query_tokens:
                 if token in comment.comment.lower():
                     value += 1
+                    matches.append({'type': 'ticket_comment', 'text': f'contains {token}'})
         for email in issue.emails.all():
             for token in query_tokens:
                 if token in email.subject.lower():
                     value += 1
+                    matches.append({'type': 'email_subject', 'text': f'contains {token}'})
                 if token in email.body.lower():
                     value += 1
+                    matches.append({'type': 'email_body', 'text': f'contains {token}'})
                 if token in email.sender.lower():
                     value += 1
+                    matches.append({'type': 'email_sender', 'text': f'contains {token}'})
         if value > 0:
-            yield {'search_score': value, 'issue': issue}
+            yield {'search_score': value, 'issue': issue, 'search_matches': matches}
 
 
 @api_view(['GET'])
```
`core/tickets/migrations/0013_alter_statechange_state.py` (new file, 18 lines):

```python
# Generated by Django 4.2.7 on 2025-03-15 21:31

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('tickets', '0012_remove_issuethread_related_items_and_more'),
    ]

    operations = [
        migrations.AlterField(
            model_name='statechange',
            name='state',
            field=models.CharField(choices=[('pending_new', 'New'), ('pending_open', 'Open'), ('pending_shipping', 'Needs to be shipped'), ('pending_physical_confirmation', 'Needs to be confirmed physically'), ('pending_return', 'Needs to be returned'), ('pending_postponed', 'Postponed'), ('pending_suspected_spam', 'Suspected Spam'), ('waiting_details', 'Waiting for details'), ('waiting_pre_shipping', 'Waiting for Address/Shipping Info'), ('closed_returned', 'Closed: Returned'), ('closed_shipped', 'Closed: Shipped'), ('closed_not_found', 'Closed: Not found'), ('closed_not_our_problem', 'Closed: Not our problem'), ('closed_duplicate', 'Closed: Duplicate'), ('closed_timeout', 'Closed: Timeout'), ('closed_spam', 'Closed: Spam'), ('closed_nothing_missing', 'Closed: Nothing missing'), ('closed_wtf', 'Closed: WTF'), ('found_open', 'Item Found and stored externally'), ('found_closed', 'Item Found and stored externally and closed')], default='pending_new', max_length=255),
        ),
    ]
```
```diff
@@ -16,6 +16,7 @@ STATE_CHOICES = (
     ('pending_physical_confirmation', 'Needs to be confirmed physically'),
     ('pending_return', 'Needs to be returned'),
     ('pending_postponed', 'Postponed'),
+    ('pending_suspected_spam', 'Suspected Spam'),
     ('waiting_details', 'Waiting for details'),
     ('waiting_pre_shipping', 'Waiting for Address/Shipping Info'),
     ('closed_returned', 'Closed: Returned'),
@@ -46,6 +47,11 @@ class IssueThread(SoftDeleteModel):
     event = models.ForeignKey(Event, null=True, on_delete=models.SET_NULL, related_name='issue_threads')
     manually_created = models.BooleanField(default=False)
 
+    def __init__(self, *args, **kwargs):
+        if 'initial_state' in kwargs:
+            self._initial_state = kwargs.pop('initial_state')
+        super().__init__(*args, **kwargs)
+
     def short_uuid(self):
         return self.uuid[:8]
 
@@ -110,8 +116,9 @@ def set_uuid(sender, instance, **kwargs):
 
 @receiver(post_save, sender=IssueThread)
 def create_issue_thread(sender, instance, created, **kwargs):
-    if created:
-        StateChange.objects.create(issue_thread=instance, state='pending_new')
+    if created and instance.state_changes.count() == 0:
+        initial_state = getattr(instance, '_initial_state', None)
+        StateChange.objects.create(issue_thread=instance, state=initial_state if initial_state else 'pending_new')
 
 
 class Comment(models.Model):
```
```diff
@@ -139,10 +139,12 @@ class IssueSerializer(BasicIssueSerializer):
 
 class SearchResultSerializer(serializers.Serializer):
     search_score = serializers.IntegerField()
+    search_matches = serializers.ListField(child=serializers.DictField())
     issue = IssueSerializer()
 
     def to_representation(self, instance):
-        return {**IssueSerializer(instance['issue']).data, 'search_score': instance['search_score']}
+        return {**IssueSerializer(instance['issue']).data, 'search_score': instance['search_score'],
+                'search_matches': instance['search_matches']}
 
     class Meta:
         model = IssueThread
```
```diff
@@ -9,6 +9,7 @@ class RelationSerializer(serializers.ModelSerializer):
     class Meta:
         model = ItemRelation
         fields = ('id', 'status', 'timestamp', 'item', 'issue_thread')
+        read_only_fields = ('id', 'timestamp')
 
 
 class BasicIssueSerializer(serializers.ModelSerializer):
```
```diff
@@ -4,6 +4,7 @@ from django.test import TestCase, Client
 
 from authentication.models import ExtendedUser
 from inventory.models import Event, Container, Item
+from inventory.models import Comment as ItemComment
 from mail.models import Email, EmailAttachment
 from tickets.models import IssueThread, StateChange, Comment, ItemRelation, Assignment
 from django.contrib.auth.models import Permission
@@ -407,16 +408,16 @@ class IssueSearchTest(TestCase):
         mail1 = Email.objects.create(
             subject='test',
             body='test aBc',
-            sender='test',
-            recipient='test',
+            sender='bar@test',
+            recipient='2@test',
             issue_thread=issue,
             timestamp=now,
         )
         mail2 = Email.objects.create(
-            subject='test',
+            subject='Re: test',
             body='test',
-            sender='test',
-            recipient='test',
+            sender='2@test',
+            recipient='1@test',
             issue_thread=issue,
             in_reply_to=mail1.reference,
             timestamp=now + timedelta(seconds=2),
@@ -436,6 +437,11 @@ class IssueSearchTest(TestCase):
             item=self.item,
             timestamp=now + timedelta(seconds=5),
         )
+        item_comment = ItemComment.objects.create(
+            item=self.item,
+            comment="baz",
+            timestamp=now + timedelta(seconds=6),
+        )
         search_query = b64encode(b'abC').decode('utf-8')
         response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/')
         self.assertEqual(200, response.status_code)
@@ -465,3 +471,21 @@ class IssueSearchTest(TestCase):
         self.assertGreater(score3, score2)
         self.assertGreater(score2, score1)
         self.assertGreater(score1, 0)
+
+        search_query = b64encode(b'foo').decode('utf-8')
+        response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/')
+        self.assertEqual(200, response.status_code)
+        self.assertEqual(1, len(response.json()))
+        self.assertEqual(issue.id, response.json()[0]['id'])
+
+        search_query = b64encode(b'bar').decode('utf-8')
+        response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/')
+        self.assertEqual(200, response.status_code)
+        self.assertEqual(1, len(response.json()))
+        self.assertEqual(issue.id, response.json()[0]['id'])
+
+        search_query = b64encode(b'baz').decode('utf-8')
+        response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/')
+        self.assertEqual(200, response.status_code)
+        self.assertEqual(1, len(response.json()))
+        self.assertEqual(issue.id, response.json()[0]['id'])
```
```diff
@@ -1,13 +1,8 @@
-FROM python:3.11-bookworm
+FROM python:3.11-slim-bookworm
 LABEL authors="lagertonne"
 
 ENV PYTHONUNBUFFERED 1
 RUN mkdir /code
 WORKDIR /code
 COPY requirements.dev.txt /code/
-COPY requirements.prod.txt /code/
-RUN apt update && apt install -y mariadb-client
 RUN pip install -r requirements.dev.txt
-RUN pip install -r requirements.prod.txt
-RUN pip install mysqlclient
 COPY .. /code/
```
```diff
@@ -1,4 +1,4 @@
-FROM docker.io/node:22
+FROM node:22-alpine
 
 RUN mkdir /web
 WORKDIR /web
```
```diff
@@ -1,3 +1,4 @@
+name: c3lf-sys3-dev
 services:
   core:
     build:
@@ -6,10 +7,12 @@ services:
     command: bash -c 'python manage.py migrate && python testdata.py && python manage.py runserver 0.0.0.0:8000'
     environment:
       - HTTP_HOST=core
-      - DB_FILE=dev.db
+      - DB_FILE=.local/dev.db
+      - DEBUG_MODE_ACTIVE=true
     volumes:
-      - ../../core:/code
-      - ../testdata.py:/code/testdata.py
+      - ../../core:/code:ro
+      - ../testdata.py:/code/testdata.py:ro
+      - backend_context:/code/.local
     ports:
       - "8000:8000"
 
@@ -19,10 +22,12 @@ services:
       dockerfile: ../deploy/dev/Dockerfile.frontend
     command: npm run serve
     volumes:
-      - ../../web:/web:ro
-      - /web/node_modules
+      - ../../web/src:/web/src
      - ./vue.config.js:/web/vue.config.js
     ports:
       - "8080:8080"
     depends_on:
       - core
+
+volumes:
+  backend_context:
```
```diff
@@ -1,11 +1,11 @@
-FROM python:3.11-bookworm
+FROM python:3.11-slim-bookworm
 LABEL authors="lagertonne"
 
 ENV PYTHONUNBUFFERED 1
 RUN mkdir /code
 WORKDIR /code
-COPY requirements.prod.txt /code/
-RUN apt update && apt install -y mariadb-client
-RUN pip install -r requirements.prod.txt
+RUN apt update && apt install -y pkg-config mariadb-client default-libmysqlclient-dev build-essential
 RUN pip install mysqlclient
+COPY requirements.prod.txt /code/
+RUN pip install -r requirements.prod.txt
 COPY .. /code/
```
```diff
@@ -1,4 +1,4 @@
-FROM docker.io/node:22
+FROM node:22-alpine
 
 RUN mkdir /web
 WORKDIR /web
```
```diff
@@ -1,3 +1,4 @@
+name: c3lf-sys3-testing
 services:
   redis:
     image: redis
@@ -31,8 +32,9 @@ services:
       - DB_PASSWORD=system3
       - MAIL_DOMAIN=mail:1025
     volumes:
-      - ../../core:/code
-      - ../testdata.py:/code/testdata.py
+      - ../../core:/code:ro
+      - ../testdata.py:/code/testdata.py:ro
+      - backend_context:/code
     ports:
       - "8000:8000"
     depends_on:
@@ -47,8 +49,8 @@ services:
     command: npm run serve
     volumes:
       - ../../web:/web:ro
-      - /web/node_modules
-      - ./vue.config.js:/web/vue.config.js
+      - ./vue.config.js:/web/vue.config.js:ro
+      - frontend_context:/web
     ports:
       - "8080:8080"
     depends_on:
@@ -70,3 +72,5 @@ services:
 volumes:
   mariadb_data:
   mailpit_data:
+  frontend_context:
+  backend_context:
```
`web/package-lock.json` (generated, 9399 lines): file diff suppressed because it is too large.
```diff
@@ -24,6 +24,15 @@
         <span class="timeline-item-icon faded-icon" v-else-if="item.type === 'placement'">
           <font-awesome-icon icon="archive"/>
         </span>
+        <span class="timeline-item-icon faded-icon" v-else-if="item.type === 'created'">
+          <font-awesome-icon icon="archive"/>
+        </span>
+        <span class="timeline-item-icon faded-icon" v-else-if="item.type === 'returned'">
+          <font-awesome-icon icon="archive"/>
+        </span>
+        <span class="timeline-item-icon faded-icon" v-else-if="item.type === 'deleted'">
+          <font-awesome-icon icon="trash"/>
+        </span>
         <span class="timeline-item-icon faded-icon" v-else>
           <font-awesome-icon icon="pen"/>
         </span>
@@ -35,6 +44,9 @@
         <TimelineShippingVoucher v-else-if="item.type === 'shipping_voucher'" :item="item"/>
         <TimelinePlacement v-else-if="item.type === 'placement'" :item="item"/>
         <TimelineRelatedTicket v-else-if="item.type === 'issue_relation'" :item="item"/>
+        <TimelineCreated v-else-if="item.type === 'created'" :item="item"/>
+        <TimelineReturned v-else-if="item.type === 'returned'" :item="item"/>
+        <TimelineDeleted v-else-if="item.type === 'deleted'" :item="item"/>
         <p v-else>{{ item }}</p>
       </li>
       <li class="timeline-item">
@@ -58,10 +70,16 @@ import TimelineShippingVoucher from "@/components/TimelineShippingVoucher.vue";
 import AsyncButton from "@/components/inputs/AsyncButton.vue";
 import TimelinePlacement from "@/components/TimelinePlacement.vue";
 import TimelineRelatedTicket from "@/components/TimelineRelatedTicket.vue";
+import TimelineCreated from "@/components/TimelineCreated.vue";
+import TimelineReturned from "@/components/TimelineReturned.vue";
+import TimelineDeleted from "@/components/TimelineDeleted.vue";
 
 export default {
   name: 'Timeline',
   components: {
+    TimelineDeleted,
+    TimelineReturned,
+    TimelineCreated,
     TimelineRelatedTicket,
     TimelinePlacement,
     TimelineShippingVoucher,
```
`web/src/components/TimelineCreated.vue` (new file, 83 lines):

```vue
<template>
  <div class="timeline-item-description"><span>created by
    <i class="avatar | small">
      <font-awesome-icon icon="user"/>
    </i>
    <a href="#">$USER</a> at <time :datetime="timestamp">{{ timestamp }}</time></span>
  </div>
</template>

<script>

import {mapState} from "vuex";

export default {
  name: 'TimelineCreated',
  props: {
    'item': {
      type: Object,
      required: true
    }
  },
  computed: {
    'timestamp': function () {
      return new Date(this.item.timestamp).toLocaleString();
    },
  }
};
</script>

<style scoped>

a {
  color: inherit;
}

.timeline-item-description {
  display: flex;
  padding-top: 6px;
  gap: 8px;
  color: var(--gray);

  img {
    flex-shrink: 0;
  }

  a {
    /*color: var(--c-grey-500);*/
    font-weight: 500;
    text-decoration: none;

    &:hover,
    &:focus {
      outline: 0; /* Don't actually do this */
      color: var(--info);
    }
  }
}

.avatar {
  display: inline-flex;
  align-items: center;
  justify-content: center;
  border-radius: 50%;
  overflow: hidden;
  aspect-ratio: 1 / 1;
  flex-shrink: 0;
  width: 40px;
  height: 40px;

  &.small {
    width: 28px;
    height: 28px;
  }

  img {
    object-fit: cover;
  }
}

</style>
```
`web/src/components/TimelineDeleted.vue` (new file, 83 lines):

```vue
<template>
  <div class="timeline-item-description"><span>marked deleted by
    <i class="avatar | small">
      <font-awesome-icon icon="user"/>
    </i>
    <a href="#">$USER</a> at <time :datetime="timestamp">{{ timestamp }}</time></span>
  </div>
</template>

<script>

import {mapState} from "vuex";

export default {
  name: 'TimelineDeleted',
  props: {
    'item': {
      type: Object,
      required: true
    }
  },
  computed: {
    'timestamp': function () {
      return new Date(this.item.timestamp).toLocaleString();
    },
  }
};
</script>

<style scoped>

a {
  color: inherit;
}

.timeline-item-description {
  display: flex;
  padding-top: 6px;
  gap: 8px;
  color: var(--gray);

  img {
    flex-shrink: 0;
  }

  a {
    /*color: var(--c-grey-500);*/
    font-weight: 500;
    text-decoration: none;

    &:hover,
    &:focus {
      outline: 0; /* Don't actually do this */
      color: var(--info);
    }
  }
}

.avatar {
  display: inline-flex;
  align-items: center;
  justify-content: center;
  border-radius: 50%;
  overflow: hidden;
  aspect-ratio: 1 / 1;
  flex-shrink: 0;
  width: 40px;
  height: 40px;

  &.small {
    width: 28px;
    height: 28px;
  }

  img {
    object-fit: cover;
  }
}

</style>
```
`web/src/components/TimelineReturned.vue` (new file, 83 lines):

```vue
<template>
  <div class="timeline-item-description"><span>marked returned by
    <i class="avatar | small">
      <font-awesome-icon icon="user"/>
    </i>
    <a href="#">$USER</a> at <time :datetime="timestamp">{{ timestamp }}</time></span>
  </div>
</template>

<script>

import {mapState} from "vuex";

export default {
  name: 'TimelineReturned',
  props: {
    'item': {
      type: Object,
      required: true
    }
  },
  computed: {
    'timestamp': function () {
      return new Date(this.item.timestamp).toLocaleString();
    },
  }
};
</script>

<style scoped>

a {
  color: inherit;
}

.timeline-item-description {
  display: flex;
  padding-top: 6px;
  gap: 8px;
  color: var(--gray);

  img {
    flex-shrink: 0;
  }

  a {
    /*color: var(--c-grey-500);*/
    font-weight: 500;
    text-decoration: none;

    &:hover,
    &:focus {
      outline: 0; /* Don't actually do this */
      color: var(--info);
    }
  }
}

.avatar {
  display: inline-flex;
  align-items: center;
  justify-content: center;
  border-radius: 50%;
  overflow: hidden;
  aspect-ratio: 1 / 1;
  flex-shrink: 0;
  width: 40px;
  height: 40px;

  &.small {
    width: 28px;
    height: 28px;
  }

  img {
    object-fit: cover;
  }
}

</style>
```
```diff
@@ -1,9 +1,9 @@
 <template>
-  <button @click.stop="handleClick" :disabled="disabled">
+  <button @click.stop="handleClick" :disabled="disabled || inProgress">
     <span class="spinner-border spinner-border-sm" role="status" aria-hidden="true"
-          :class="{'d-none': !disabled}"></span>
-    <span class="ml-2" :class="{'d-none': !disabled}">In Progress...</span>
-    <span :class="{'d-none': disabled}"><slot></slot></span>
+          :class="{'d-none': !inProgress}"></span>
+    <span class="ml-2" :class="{'d-none': !inProgress}">In Progress...</span>
+    <span :class="{'d-none': inProgress}"><slot></slot></span>
   </button>
 </template>
 
@@ -13,7 +13,7 @@ export default {
   name: 'AsyncButton',
   data() {
     return {
-      disabled: false,
+      inProgress: false,
     };
   },
   props: {
@@ -21,17 +21,21 @@ export default {
       type: Function,
       required: true,
     },
+    disabled: {
+      type: Boolean,
+      required: false,
+    },
   },
   methods: {
     async handleClick() {
      if (this.task && typeof this.task === 'function') {
-        this.disabled = true;
+        this.inProgress = true;
        try {
          await this.task();
        } catch (e) {
          console.error(e);
        } finally {
-          this.disabled = false;
+          this.inProgress = false;
        }
      }
    },
```
```diff
@@ -61,7 +61,6 @@ const store = createStore({
     '2kg-de': '2kg Paket (DE)',
     '5kg-de': '5kg Paket (DE)',
-    '10kg-de': '10kg Paket (DE)',
     '2kg-eu': '2kg Paket (EU)',
     '5kg-eu': '5kg Paket (EU)',
     '10kg-eu': '10kg Paket (EU)',
 }
```
```diff
@@ -77,10 +76,26 @@
     getEventTickets: (state, getters) => getters.getEventSlug === 'all' ? getters.getAllTickets : getters.getAllTickets.filter(t => t.event === getters.getEventSlug || (t.event == null && getters.getEventSlug === 'none')),
     isItemsLoaded: (state, getters) => (getters.getEventSlug === 'all' || getters.getEventSlug === 'none') ? !!state.loadedItems : Object.keys(state.loadedItems).includes(getters.getEventSlug),
     isTicketsLoaded: (state, getters) => (getters.getEventSlug === 'all' || getters.getEventSlug === 'none') ? !!state.loadedTickets : Object.keys(state.loadedTickets).includes(getters.getEventSlug),
-    getItemsSearchResults: (state, getters) => state.loadedItemSearchResults[getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || [],
-    getTicketsSearchResults: (state, getters) => state.loadedTicketSearchResults[getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || [],
-    isItemsSearchLoaded: (state, getters) => Object.keys(state.loadedItemSearchResults).includes(getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))),
-    isTicketsSearchLoaded: (state, getters) => Object.keys(state.loadedTicketSearchResults).includes(getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))),
+    getItemsSearchResults: (state, getters) => {
+        if (getters.getEventSlug === 'all') {
+            return state.events.map(e => {
+                return state.loadedItemSearchResults[e.slug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || []
+            }).flat();
+        } else {
+            return state.loadedItemSearchResults[getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || []
+        }
+    },
+    getTicketsSearchResults: (state, getters) => {
+        if (getters.getEventSlug === 'all') {
+            return state.events.map(e => {
+                return state.loadedTicketSearchResults[e.slug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || []
+            }).flat();
+        } else {
+            return state.loadedTicketSearchResults[getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || []
+        }
+    },
+    isItemsSearchLoaded: (state, getters) => Object.keys(state.loadedItemSearchResults).includes(getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))) || getters.getEventSlug === 'all',
+    isTicketsSearchLoaded: (state, getters) => Object.keys(state.loadedTicketSearchResults).includes(getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))) || getters.getEventSlug === 'all',
     getActiveView: state => router.currentRoute.value.name || 'items',
     getFilters: state => router.currentRoute.value.query,
     getBoxes: state => state.loadedBoxes,
```
```diff
@@ -379,26 +394,39 @@
         },
         async loadEventItems({commit, getters, state}) {
             if (!state.user.token) return;
             if (state.fetchedData.items > Date.now() - 1000 * 60 * 60 * 24) return;
-            try {
-                const slug = getters.getEventSlug;
-                const {data, success} = await getters.session.get(`/2/${slug}/items/`);
-                if (data && success) {
-                    commit('setItems', {slug, items: data});
-                }
-            } catch (e) {
-                console.error("Error loading items");
-            }
+            const load = async (slug) => {
+                try {
+                    const {data, success} = await getters.session.get(`/2/${slug}/items/`);
+                    if (data && success) {
+                        commit('setItems', {slug, items: data});
+                    }
+                } catch (e) {
+                    console.error("Error loading items");
+                }
+            }
+            const slug = getters.getEventSlug;
+            if (slug === 'all') {
+                await Promise.all(state.events.map(e => load(e.slug)));
+            } else {
+                await load(slug);
+            }
         },
         async searchEventItems({commit, getters, state}, query) {
             const encoded_query = base64.encode(utf8.encode(query));
+            const load = async (slug) => {
+                if (Object.keys(state.loadedItemSearchResults).includes(slug + '/' + encoded_query)) return;
+                const {
+                    data, success
+                } = await getters.session.get(`/2/${slug}/items/${encoded_query}/`);
+                if (data && success) {
+                    commit('setItemSearchResults', {slug, query: encoded_query, items: data});
+                }
+            }
             const slug = getters.getEventSlug;
-            if (Object.keys(state.loadedItemSearchResults).includes(slug + '/' + encoded_query)) return;
-            const {
-                data, success
-            } = await getters.session.get(`/2/${slug}/items/${encoded_query}/`);
-            if (data && success) {
-                commit('setItemSearchResults', {slug, query: encoded_query, items: data});
+            if (slug === 'all') {
+                await Promise.all(state.events.map(e => load(e.slug)));
+            } else {
+                await load(slug);
             }
         },
         async loadBoxes({commit, state, getters}) {
```
@@ -446,12 +474,19 @@ const store = createStore({
         },
         async searchEventTickets({commit, getters, state}, query) {
             const encoded_query = base64.encode(utf8.encode(query));
+            const load = async (slug) => {
+                if (Object.keys(state.loadedTicketSearchResults).includes(slug + '/' + encoded_query)) return;
+                const {
+                    data, success
+                } = await getters.session.get(`/2/${slug}/tickets/${encoded_query}/`);
+                if (data && success) commit('setTicketSearchResults', {slug, query: encoded_query, items: data});
+            }
             const slug = getters.getEventSlug;
-            if (Object.keys(state.loadedTicketSearchResults).includes(slug + '/' + encoded_query)) return;
-            const {
-                data, success
-            } = await getters.session.get(`/2/${slug}/tickets/${encoded_query}/`);
-            if (data && success) commit('setTicketSearchResults', {slug, query: encoded_query, items: data});
+            if (slug === 'all') {
+                await Promise.all(state.events.map(e => load(e.slug)));
+            } else {
+                await load(slug);
+            }
         },
         async sendMail({commit, dispatch, state, getters}, {id, message}) {
             const {data, success} = await getters.session.post(`/2/tickets/${id}/reply/`, {message},
@@ -528,6 +563,14 @@ const store = createStore({
                 state.fetchedData.tickets = 0;
                 await Promise.all([dispatch('loadTickets'), dispatch('fetchShippingVouchers')]);
             }
         },
+        async linkTicketItem({dispatch, state, getters}, {ticket_id, item_id}) {
+            const {data, success} = await getters.session.post(`/2/matches/`, {issue_thread: ticket_id, item: item_id});
+            if (data && success) {
+                state.fetchedData.tickets = 0;
+                state.fetchedData.items = 0;
+                await Promise.all([dispatch('loadTickets'), dispatch('loadEventItems')]);
+            }
+        }
     },
     plugins: [persistentStatePlugin({ // TODO change remember to some kind of enable field
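The store changes above all follow one refactoring pattern: each action wraps its per-event fetch in a local `load` helper, then either fans out over every known event with `Promise.all` when the selected slug is `'all'`, or loads just the one event. A minimal standalone sketch of that pattern (the `events`, `fetched`, and `fetchItems` names here are illustrative stand-ins, not the project's actual state or session API):

```javascript
// Stand-ins for store state and the session HTTP client + commit.
const events = [{slug: 'camp23'}, {slug: 'congress37'}];
const fetched = [];
const fetchItems = async (slug) => {
    fetched.push(slug); // would be a GET /2/<slug>/items/ plus a commit
};

async function loadEventItems(selectedSlug) {
    // per-event helper, mirroring the `load` closures in the diff
    const load = async (slug) => {
        try {
            await fetchItems(slug);
        } catch (e) {
            console.error("Error loading items");
        }
    };
    if (selectedSlug === 'all') {
        // fan out over all events concurrently
        await Promise.all(events.map(e => load(e.slug)));
    } else {
        await load(selectedSlug);
    }
}

loadEventItems('all').then(() => console.log(fetched.join(','))); // prints "camp23,congress37"
```

Because `Promise.all` receives one promise per event, a slow or failing event no longer blocks the others; the `try`/`catch` inside `load` keeps a single failure from rejecting the whole fan-out.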
@@ -17,7 +17,7 @@
                         <textarea placeholder="add comment..." v-model="newComment"
                                   class="form-control">
                         </textarea>
-                        <AsyncButton class="btn btn-primary float-right" :task="addCommentAndClear">
+                        <AsyncButton class="btn btn-secondary float-right" :task="addCommentAndClear" :disabled="!newComment">
                             <font-awesome-icon icon="comment"/>
                             Save Comment
                         </AsyncButton>
@@ -25,7 +25,7 @@
                 </div>
             </template>
             <template v-slot:timeline_action2>
-                <span class="timeline-item-icon | faded-icon">
+                <span class="timeline-item-icon | filled-icon">
                     <font-awesome-icon icon="envelope"/>
                 </span>
                 <div class="new-mail card bg-dark">
@@ -35,7 +35,7 @@
                     <div>
                         <textarea placeholder="reply mail..." v-model="newMail" class="form-control">
                         </textarea>
-                        <AsyncButton class="btn btn-primary float-right" :task="sendMailAndClear">
+                        <AsyncButton class="btn btn-primary float-right" :task="sendMailAndClear" :disabled="!newMail">
                             <font-awesome-icon icon="envelope"/>
                             Send Mail
                         </AsyncButton>
@@ -81,6 +81,13 @@
                         <font-awesome-icon icon="clipboard"/>
                         Copy DHL contact to clipboard
                     </ClipboardButton>
+                    <div class="btn-group">
+                        <input type="text" class="form-control" v-model="item_id">
+                        <button class="form-control btn btn-success" :disabled="!item_id"
+                                @click="linkTicketItem({ticket_id: ticket.id, item_id: parseInt(item_id)}).then(()=>item_id='')">
+                            Link Item
+                        </button>
+                    </div>
                     <div class="btn-group">
                         <select class="form-control" v-model="shipping_voucher_type">
                             <option v-for="type in availableShippingVoucherTypes.filter(t=>t.count>0)"
@@ -141,6 +148,7 @@ export default {
             selected_state: null,
             selected_assignee: null,
             shipping_voucher_type: null,
+            item_id: "",
             newMail: "",
             newComment: ""
         }
@@ -166,6 +174,7 @@ export default {
         ...mapActions(['deleteItem', 'markItemReturned', 'sendMail', 'updateTicketPartial', 'postComment']),
         ...mapActions(['loadTickets', 'fetchTicketStates', 'loadUsers', 'scheduleAfterInit']),
         ...mapActions(['claimShippingVoucher', 'fetchShippingVouchers']),
+        ...mapActions(['linkTicketItem']),
         ...mapMutations(['openLightboxModalWith']),
         changeTicketStatus() {
             this.ticket.state = this.selected_state;
@@ -198,10 +207,10 @@ export default {
         },
         mounted() {
             this.scheduleAfterInit(() => [Promise.all([this.fetchTicketStates(), this.loadTickets(), this.loadUsers(), this.fetchShippingVouchers()]).then(() => {
-                if (this.ticket.state === "pending_new") {
-                    this.selected_state = "pending_open";
-                    this.changeTicketStatus()
-                }
+                //if (this.ticket.state === "pending_new") {
+                //    this.selected_state = "pending_open";
+                //    this.changeTicketStatus()
+                //}
                 this.selected_state = this.ticket.state;
                 this.selected_assignee = this.ticket.assigned_to
             })]);
@@ -25,7 +25,7 @@
               :columns="['id', 'name', 'last_activity', 'assigned_to',
                          ...(getEventSlug==='all'?['event']:[])]"
               :keyName="'state'" :sections="['pending_new', 'pending_open','pending_shipping',
-                'pending_physical_confirmation','pending_return','pending_postponed'].map(stateInfo)">
+                'pending_physical_confirmation','pending_return','pending_postponed','pending_suspected_spam'].map(stateInfo)">
        <template #section_header="{index, section, count}">
          {{ section.text }} <span class="badge badge-light ml-1">{{ count }}</span>
        </template>
@@ -26,7 +26,7 @@
               :columns="['id', 'name', 'last_activity', 'assigned_to',
                          ...(getEventSlug==='all'?['event']:[])]"
               :keyName="'state'" :sections="['pending_new', 'pending_open','pending_shipping',
-                'pending_physical_confirmation','pending_return','pending_postponed'].map(stateInfo)">
+                'pending_physical_confirmation','pending_return','pending_postponed','pending_suspected_spam'].map(stateInfo)">
        <template #section_header="{index, section, count}">
          {{ section.text }} <span class="badge badge-light ml-1">{{ count }}</span>
        </template>