Compare commits
No commits in common. "testing" and "jedi/search" have entirely different histories.
testing
...
jedi/search
34 changed files with 170 additions and 10340 deletions
README.md (158 lines removed)

@@ -1,158 +0,0 @@

# C3LF System3

the third try to automate lost&found organization for chaos events. not a complete rewrite, but instead building on top of the web frontend of version 2. everything else is new but still API compatible. now with more monorepo.

## Architecture

C3LF System3 integrates a Django-Rest-Framework + WebSocket backend, a Vue.js frontend SPA and a minimal LMTP mail server embedded in the Django backend. It is additionally deployed with a Postfix mail server as a proxy in front of the LMTP socket, a MariaDB database, a Redis cache and an Nginx reverse proxy that serves the static SPA frontend, proxies the API requests to the backend and serves the media files in cooperation with the Django backend using the `X-Accel-Redirect` header.
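The media handoff works roughly like this: Django decides whether a request may access a file and then hands the actual transfer back to Nginx via `X-Accel-Redirect`. A minimal sketch of the Django side, assuming an internal `/protected/` location in Nginx and a simple permission check (both are illustrative assumptions, not copied from the repository):

```python
# Minimal sketch, not the actual implementation: Django answers with an
# X-Accel-Redirect header and Nginx then serves the file from an internal
# location. "/protected/" and the permission check are assumptions.
from django.http import HttpResponse, HttpResponseForbidden


def serve_media(request, path):
    if not request.user.is_authenticated:
        return HttpResponseForbidden()
    response = HttpResponse()
    response["Content-Type"] = ""  # let Nginx pick the content type
    response["X-Accel-Redirect"] = f"/protected/{path}"
    return response
```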
The production deployment is automated using Ansible and there are some Docker Compose configurations for development.

## Project Structure

- `core/` Contains the Django backend with database models, API endpoints, migrations, API tests, and mail server functionality.
- `web/` Contains the Vue.js frontend application.
- `deploy/` Contains deployment configurations and Docker scripts for various development modes.

For more information, see the README.md files in the respective directories.

## Development Modes

There are currently 4 development modes for this project:

- Frontend-Only
- Backend-API-Only
- Full-Stack-Lite 'dev' (docker)
- **[WIP]** Full-Stack 'testing' (docker)

*Choose the one that is most suited to the feature you want to work on or is easiest for you to set up ;)*

For all modes it is assumed that you have `git` installed, have cloned the repository and are in the root directory of the project. Use `git clone https://git.hannover.ccc.de/c3lf/c3lf-system-3.git` to get the official upstream repository.
The required packages for each mode are listed separately and also state the specific package name for Debian 12.
### Frontend-Only

This mode is for developing the frontend only. It uses the vue-cli-service (webpack) to serve the frontend and watches for changes in the source code to provide hot reloading. The API requests are proxied to the staging backend.

#### Requirements

* Node.js (~20.19.0) (`nodejs`)
* npm (~9.2.0) (`npm`)

*Note: The versions are not strict, but these are tested. Other versions might work as well.*

#### Steps

```bash
cd web
npm install
npm run serve
```

Now you can access the frontend at `localhost:8080` and start editing the code in the `web` directory.
For more information, see the README.md file in the `web` directory.
### Backend-API-Only

This mode is for developing the backend API only. It specifically excludes most WebSocket and mail server functionality. Use this mode to focus on the backend API and database models.

#### Requirements

* Python (~3.11) (`python3`)
* pip (`python3-pip`)
* virtualenv (`python3-venv`)

*Note: The versions are not strict, but these are tested. Other versions might work as well.*

#### Steps

```bash
python -m venv venv
source venv/bin/activate
pip install -r core/requirements.dev.txt
cd core
python manage.py test
```

The tests should pass out of the box, and you can then start the TDD workflow by adding new failing tests.
For more information about the backend and TDD, see the README.md file in the `core` directory.
### Full-Stack-Lite 'dev' (docker)

This mode is for developing both the frontend and the backend at the same time in a containerized environment. It uses the `docker-compose` command to build and run the application in containers. It specifically excludes all mail server and most WebSocket functionality.

#### Requirements

* Docker (`docker.io`)
* Docker Compose (`docker-compose`)

*Note: Depending on your system, the `docker compose` command might be included in the general `docker` or `docker-ce` package, or you might want to use podman instead.*

#### Steps

```bash
docker-compose -f deploy/dev/docker-compose.yml up --build
```

The page should be available at [localhost:8080](http://localhost:8080).
This mode provides a minimal set of test data, including a user `testuser` with password `testuser`. The test dataset is defined in `deploy/testdata.py` and can be extended there.
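If you need more fixtures, the dataset can be extended with plain Django ORM calls. A rough sketch of what an addition to `deploy/testdata.py` could look like (the field names are assumptions based on the inventory models and may need adjusting):

```python
# Hedged sketch for extending deploy/testdata.py; model fields (slug, name,
# description, event) are assumptions, check core/inventory/models.py.
from inventory.models import Event, Item

event, _ = Event.objects.get_or_create(slug="TESTEVENT2", name="Second Test Event")

for description in ["red backpack", "blue water bottle", "laptop charger"]:
    Item.objects.get_or_create(event=event, description=description)
```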
You can now edit code in `/web` and `/core`, and changes will be applied to the running page as soon as the file is saved.

For details about each part, read `/web/README.md` and `/core/README.md` respectively. To execute commands in the container context, use `exec` like this:

```bash
docker exec -it c3lf-sys3-dev-core-1 ./manage.py test
```

### Full-Stack 'testing' (docker)

**WORK IN PROGRESS**

*This will include Postfix, MariaDB, Redis, Nginx and the ability to test sending mail, receiving mail and WebSocket-based realtime updates in the frontend. It is the last verification step before deploying to the staging system using Ansible.*

## Online Instances

These are deployed using `deploy/ansible/playbooks/deploy-c3lf-sys3.yml` and follow a specific git branch.

### 'live'

| URL            | [c3lf.de](https://c3lf.de) |
|----------------|----------------------------|
| **Branch**     | live                       |
| **Host**       | polaris.lab.or.it          |
| **Debug Mode** | off                        |

This is the **'production' system** and should strictly follow the staging system after all changes have been validated.

### 'staging'

| URL            | [staging.c3lf.de](https://staging.c3lf.de) |
|----------------|--------------------------------------------|
| **Branch**     | testing                                    |
| **Host**       | andromeda.lab.or.it                        |
| **Debug Mode** | on                                         |

This system is automatically updated by [git.hannover.ccc.de](https://git.hannover.ccc.de/c3lf/c3lf-system-3/) whenever a commit is pushed to the 'testing' branch and the backend tests pass.

**WARNING: Although this is the staging system, it is fully functional and contains a copy of the 'production' data, so do not, for example, reply to tickets for testing purposes, as the system WILL SEND AN EMAIL to the person who originally created the ticket. If you want to test something like that, first create your own test ticket by sending an email to `<event>@staging.c3lf.de`.**
@@ -1,68 +0,0 @@

# Core

This directory contains the backend of the C3LF System3 project, which is built using Django and Django Rest Framework.

## Modules

- `authentication`: Handles user authentication and authorization.
- `files`: Manages file uploads and related operations.
- `inventory`: Handles inventory management, including events, containers and items.
- `mail`: Manages email-related functionality, including sending and receiving emails.
- `notify_sessions`: Handles real-time notifications and WebSocket sessions.
- `tickets`: Manages the ticketing system for issue tracking.

## Module Structure

Most modules follow a similar structure, including the following components:

- `<module>/models.py`: Contains the database models for the module.
- `<module>/serializers.py`: Contains the serializers for the module models.
- `<module>/api_<api_version>.py`: Contains the API views and endpoints for the module.
- `<module>/migrations/`: Contains database migration files. It needs to contain an `__init__.py` file to be recognized as a Python package and for automatic migration creation to work.
- `<module>/tests/<api_version>/test_<feature_model_or_testcase>.py`: Contains the test cases for the module.
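Put together, a new module following this structure could start out like the sketch below. The `Widget` model, its fields and the `v2` module name are made up for illustration; the existing modules in this directory are the reference for real code.

```python
# Hypothetical sketch of a new module following the structure above.

# <module>/models.py
from django.db import models


class Widget(models.Model):
    name = models.CharField(max_length=255)
    description = models.TextField(blank=True)


# <module>/serializers.py
from rest_framework import serializers


class WidgetSerializer(serializers.ModelSerializer):
    class Meta:
        model = Widget
        fields = ('id', 'name', 'description')


# <module>/api_v2.py
from rest_framework import viewsets


class WidgetViewSet(viewsets.ModelViewSet):
    queryset = Widget.objects.all()
    serializer_class = WidgetSerializer
```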
## Development Setup

Follow the instructions under 'Backend-API-Only' or 'Full-Stack-Lite' in the root level `README.md` to set up a development environment.

## Test-Driven Development (TDD) Workflow

The project follows a TDD workflow to ensure code quality and reliability. Here is a step-by-step guide to the TDD process:

1. **Write a Test**: Start by writing a test case for the new feature or bug fix. Place the test case in the appropriate module within the `<module>/tests/<api_version>/test_<feature_model_or_testcase>.py` file.

2. **Run the Test**: Execute the test to ensure it fails, confirming that the feature is not yet implemented or the bug exists.
    ```bash
    python manage.py test
    ```

3. **Write the Code**: Implement the code required to pass the test. Write the code in the appropriate module within the project.

4. **Run the Test Again**: Execute the test again to ensure it passes.
    ```bash
    python manage.py test
    ```

5. **Refactor**: Refactor the code to improve its structure and readability while ensuring that all tests still pass.

6. **Repeat**: Repeat the process for each new feature or bug fix.
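A minimal example of step 1, using a made-up `widgets` module and endpoint (the path and the expected payload are illustrative, not an existing API):

```python
# <module>/tests/v2/test_widget.py (hypothetical example for step 1)
from django.test import TestCase, Client


class WidgetApiTest(TestCase):

    def setUp(self):
        self.client = Client()

    def test_list_widgets_empty(self):
        # This fails until the endpoint exists, which is the point of step 2.
        response = self.client.get('/api/2/widgets/')
        self.assertEqual(200, response.status_code)
        self.assertEqual([], response.json())
```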
## Measuring Test Coverage

The project uses the `coverage` package to measure test coverage. To generate a coverage report, run the following command:

```bash
coverage run --source='.' manage.py test
coverage report
```

## Additional Information

For more detailed information on the project structure and development modes, refer to the root level `README.md`.
```diff
@@ -3,20 +3,18 @@ from prometheus_client.core import CounterMetricFamily, REGISTRY
 from django.db.models import Case, Value, When, BooleanField, Count
 from inventory.models import Item


 class ItemCountCollector(object):

     def collect(self):
-        try:
         counter = CounterMetricFamily("item_count", "Current number of items", labels=['event', 'returned_state'])

         yield counter

         if not apps.models_ready or not apps.apps_ready:
             return

         queryset = (
             Item.all_objects
             .annotate(
                 returned=Case(
                     When(returned_at__isnull=True, then=Value(False)),
@@ -27,14 +25,11 @@ class ItemCountCollector(object):
             .values('event__slug', 'returned', 'event_id')
             .annotate(amount=Count('id'))
             .order_by('event__slug', 'returned')  # Optional: order by slug and returned
         )

         for e in queryset:
             counter.add_metric([e["event__slug"].lower(), str(e["returned"])], e["amount"])

             yield counter
-        except:
-            pass


 REGISTRY.register(ItemCountCollector())
```
```diff
@@ -1,5 +1,6 @@
 from django.core.files.base import ContentFile
 from django.db import models, IntegrityError
+from django_softdelete.models import SoftDeleteModel

 from inventory.models import Item

@@ -9,8 +10,7 @@ def hash_upload(instance, filename):


 class FileManager(models.Manager):

-    def __file_data_helper(self, **kwargs):
+    def get_or_create(self, **kwargs):
         if 'data' in kwargs and type(kwargs['data']) == str:
             import base64
             from hashlib import sha256
@@ -31,10 +31,6 @@ class FileManager(models.Manager):
             pass
         else:
             raise ValueError('data must be a base64 encoded string or file and hash must be provided')
-        return kwargs
-
-    def get_or_create(self, **kwargs):
-        kwargs = self.__file_data_helper(**kwargs)
         try:
             return self.get(hash=kwargs['hash']), False
         except self.model.DoesNotExist:
@@ -43,7 +39,26 @@ class FileManager(models.Manager):
             return obj, True

     def create(self, **kwargs):
-        kwargs = self.__file_data_helper(**kwargs)
+        if 'data' in kwargs and type(kwargs['data']) == str:
+            import base64
+            from hashlib import sha256
+            raw = kwargs['data']
+            if not raw.startswith('data:'):
+                raise ValueError('data must be a base64 encoded string or file and hash must be provided')
+            raw = raw.split(';base64,')
+            if len(raw) != 2:
+                raise ValueError('data must be a base64 encoded string or file and hash must be provided')
+            mime_type = raw[0].split(':')[1]
+            content = base64.b64decode(raw[1], validate=True)
+            kwargs.pop('data')
+            content_hash = sha256(content).hexdigest()
+            kwargs['file'] = ContentFile(content, content_hash)
+            kwargs['hash'] = content_hash
+            kwargs['mime_type'] = mime_type
+        elif 'file' in kwargs and 'hash' in kwargs and type(kwargs['file']) == ContentFile and 'mime_type' in kwargs:
+            pass
+        else:
+            raise ValueError('data must be a base64 encoded string or file and hash must be provided')
         if not self.filter(hash=kwargs['hash']).exists():
             obj = super().create(**kwargs)
             obj.file.save(content=kwargs['file'], name=kwargs['hash'])
```
```diff
@@ -39,61 +39,13 @@ class ItemViewSet(viewsets.ModelViewSet):

 def filter_items(items, query):
     query_tokens = query.split(' ')
-    matches = []
     for item in items:
         value = 0
-        if "I#" + str(item.id) in query:
-            value += 12
-            matches.append(
-                {'type': 'item_id', 'text': f'is exactly {item.id} and matched "I#{item.id}"'})
-        elif "#" + str(item.id) in query:
-            value += 11
-            matches.append(
-                {'type': 'item_id', 'text': f'is exactly {item.id} and matched "#{item.id}"'})
-        elif str(item.id) in query:
-            value += 10
-            matches.append({'type': 'item_id', 'text': f'is exactly {item.id}'})
-        for issue in item.related_issues:
-            if "T#" + issue.short_uuid() in query:
-                value += 8
-                matches.append({'type': 'ticket_uuid',
-                                'text': f'is exactly {issue.short_uuid()} and matched "T#{issue.short_uuid()}"'})
-            elif "#" + issue.short_uuid() in query:
-                value += 5
-                matches.append({'type': 'ticket_uuid',
-                                'text': f'is exactly {issue.short_uuid()} and matched "#{issue.short_uuid()}"'})
-            elif issue.short_uuid() in query:
-                value += 3
-                matches.append({'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()}'})
-            if "T#" + str(issue.id) in query:
-                value += 8
-                matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "T#{issue.id}"'})
-            elif "#" + str(issue.id) in query:
-                value += 5
-                matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "#{issue.id}"'})
-            elif str(issue.id) in query:
-                value += 3
-                matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id}'})
-            for comment in issue.comments.all():
-                for token in query_tokens:
-                    if token in comment.comment:
-                        value += 1
-                        matches.append({'type': 'ticket_comment', 'text': f'contains {token}'})
-            for token in query_tokens:
-                if token in issue.name:
-                    value += 1
-                    matches.append({'type': 'ticket_name', 'text': f'contains {token}'})
         for token in query_tokens:
             if token in item.description:
                 value += 1
-                matches.append({'type': 'item_description', 'text': f'contains {token}'})
-        for comment in item.comments.all():
-            for token in query_tokens:
-                if token in comment.comment:
-                    value += 1
-                    matches.append({'type': 'comment', 'text': f'contains {token}'})
         if value > 0:
-            yield {'search_score': value, 'item': item, 'search_matches': matches}
+            yield {'search_score': value, 'item': item}


 @api_view(['GET'])
```
```diff
@@ -1,9 +1,6 @@
 from itertools import groupby

 from django.db import models
-from django.db.models.signals import pre_save
-from django.dispatch import receiver
-from django.utils import timezone
 from django_softdelete.models import SoftDeleteModel, SoftDeleteManager


@@ -67,11 +64,6 @@ class Item(SoftDeleteModel):
         return '[' + str(self.id) + ']' + self.description


-@receiver(pre_save, sender=Item)
-def item_updated(sender, instance, **kwargs):
-    instance.updated_at = timezone.now()
-
-
 class Container(SoftDeleteModel):
     id = models.AutoField(primary_key=True)
     name = models.CharField(max_length=255)
```
```diff
@@ -132,33 +132,15 @@ class ItemSerializer(BasicItemSerializer):
                 'cid': placement.container.id,
                 'box': placement.container.name
             })

-        if obj.created_at:
-            timeline.append({
-                'type': 'created',
-                'timestamp': obj.created_at,
-            })
-        if obj.returned_at:
-            timeline.append({
-                'type': 'returned',
-                'timestamp': obj.returned_at,
-            })
-        if obj.deleted_at:
-            timeline.append({
-                'type': 'deleted',
-                'timestamp': obj.deleted_at,
-            })
         return sorted(timeline, key=lambda x: x['timestamp'])


 class SearchResultSerializer(serializers.Serializer):
     search_score = serializers.IntegerField()
-    search_matches = serializers.ListField(child=serializers.DictField())
     item = ItemSerializer()

     def to_representation(self, instance):
-        return {**ItemSerializer(instance['item']).data, 'search_score': instance['search_score'],
-                'search_matches': instance['search_matches']}
+        return {**ItemSerializer(instance['item']).data, 'search_score': instance['search_score']}

     class Meta:
         model = Item
```
```diff
@@ -63,28 +63,28 @@ class ItemTestCase(TestCase):
         self.assertEqual(response.json()[0]['file'], None)
         self.assertEqual(response.json()[0]['returned'], False)
         self.assertEqual(response.json()[0]['event'], self.event.slug)
-        self.assertEqual(len(response.json()[0]['timeline']), 5)
-        self.assertEqual(response.json()[0]['timeline'][0]['type'], 'created')
-        self.assertEqual(response.json()[0]['timeline'][1]['type'], 'placement')
-        self.assertEqual(response.json()[0]['timeline'][2]['type'], 'comment')
-        self.assertEqual(response.json()[0]['timeline'][3]['type'], 'issue_relation')
-        self.assertEqual(response.json()[0]['timeline'][4]['type'], 'placement')
-        self.assertEqual(response.json()[0]['timeline'][2]['id'], comment.id)
-        self.assertEqual(response.json()[0]['timeline'][3]['id'], match.id)
-        self.assertEqual(response.json()[0]['timeline'][4]['id'], placement.id)
-        self.assertEqual(response.json()[0]['timeline'][1]['box'], 'BOX1')
-        self.assertEqual(response.json()[0]['timeline'][1]['cid'], self.box1.id)
-        self.assertEqual(response.json()[0]['timeline'][0]['timestamp'], item.created_at.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
-        self.assertEqual(response.json()[0]['timeline'][2]['comment'], 'test')
-        self.assertEqual(response.json()[0]['timeline'][2]['timestamp'], comment.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
-        self.assertEqual(response.json()[0]['timeline'][3]['status'], 'possible')
-        self.assertEqual(response.json()[0]['timeline'][3]['timestamp'], match.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
-        self.assertEqual(response.json()[0]['timeline'][3]['issue_thread']['name'], "test issue")
-        self.assertEqual(response.json()[0]['timeline'][3]['issue_thread']['event'], "EVENT")
-        self.assertEqual(response.json()[0]['timeline'][3]['issue_thread']['state'], "pending_new")
-        self.assertEqual(response.json()[0]['timeline'][4]['box'], 'BOX2')
-        self.assertEqual(response.json()[0]['timeline'][4]['cid'], self.box2.id)
-        self.assertEqual(response.json()[0]['timeline'][4]['timestamp'],
+        self.assertEqual(len(response.json()[0]['timeline']), 4)
+        self.assertEqual(response.json()[0]['timeline'][0]['type'], 'placement')
+        self.assertEqual(response.json()[0]['timeline'][1]['type'], 'comment')
+        self.assertEqual(response.json()[0]['timeline'][2]['type'], 'issue_relation')
+        self.assertEqual(response.json()[0]['timeline'][3]['type'], 'placement')
+        self.assertEqual(response.json()[0]['timeline'][1]['id'], comment.id)
+        self.assertEqual(response.json()[0]['timeline'][2]['id'], match.id)
+        self.assertEqual(response.json()[0]['timeline'][3]['id'], placement.id)
+        self.assertEqual(response.json()[0]['timeline'][0]['box'], 'BOX1')
+        self.assertEqual(response.json()[0]['timeline'][0]['cid'], self.box1.id)
+        self.assertEqual(response.json()[0]['timeline'][1]['comment'], 'test')
+        self.assertEqual(response.json()[0]['timeline'][1]['timestamp'],
+                         comment.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
+        self.assertEqual(response.json()[0]['timeline'][2]['status'], 'possible')
+        self.assertEqual(response.json()[0]['timeline'][2]['timestamp'],
+                         match.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
+        self.assertEqual(response.json()[0]['timeline'][2]['issue_thread']['name'], "test issue")
+        self.assertEqual(response.json()[0]['timeline'][2]['issue_thread']['event'], "EVENT")
+        self.assertEqual(response.json()[0]['timeline'][2]['issue_thread']['state'], "pending_new")
+        self.assertEqual(response.json()[0]['timeline'][3]['box'], 'BOX2')
+        self.assertEqual(response.json()[0]['timeline'][3]['cid'], self.box2.id)
+        self.assertEqual(response.json()[0]['timeline'][3]['timestamp'],
                          placement.timestamp.strftime('%Y-%m-%dT%H:%M:%S.%fZ'))
         self.assertEqual(len(response.json()[0]['related_issues']), 1)
         self.assertEqual(response.json()[0]['related_issues'][0]['name'], "test issue")
```
```diff
@@ -53,12 +53,6 @@ def unescape_simplified_quoted_printable(s, encoding='utf-8'):
     return quopri.decodestring(s).decode(encoding)


-def decode_inline_encodings(s):
-    s = unescape_and_decode_quoted_printable(s)
-    s = unescape_and_decode_base64(s)
-    return s
-
-
 def ascii_strip(s):
     if not s:
         return None
@@ -93,17 +87,17 @@ def make_reply(reply_email, references=None, event=None):
     reply_email.save()
     if references:
         reply["References"] = " ".join(references)
-    if reply_email.body != "":
-        reply.set_content(reply_email.body)
-        return reply
-    else:
-        raise SpecialMailException("mail content emty")
+    reply.set_content(reply_email.body)
+    return reply


 async def send_smtp(message):
     await aiosmtplib.send(message, hostname="127.0.0.1", port=25, use_tls=False, start_tls=False)


-def find_active_issue_thread(in_reply_to, address, subject, event, spam=False):
+def find_active_issue_thread(in_reply_to, address, subject, event):
     from re import match
     uuid_match = match(r'^ticket\+([a-f0-9-]{36})@', address)
     if uuid_match:
@@ -114,8 +108,7 @@ def find_active_issue_thread(in_reply_to, address, subject, event, spam=False):
     if reply_to.exists():
         return reply_to.first().issue_thread, False
     else:
-        issue = IssueThread.objects.create(name=subject, event=event,
-                                           initial_state='pending_suspected_spam' if spam else 'pending_new')
+        issue = IssueThread.objects.create(name=subject, event=event)
         return issue, True


@@ -135,13 +128,10 @@ def decode_email_segment(segment, charset, transfer_encoding):
         decode_as = 'cp1251'
     elif charset == 'iso-8859-1':
         decode_as = 'latin1'
+    segment = unescape_and_decode_quoted_printable(segment)
+    segment = unescape_and_decode_base64(segment)
     if transfer_encoding == 'quoted-printable':
         segment = unescape_simplified_quoted_printable(segment, decode_as)
-    elif transfer_encoding == 'base64':
-        import base64
-        segment = base64.b64decode(segment).decode('utf-8')
-    else:
-        segment = decode_inline_encodings(segment.decode('utf-8'))
     return segment


@@ -166,7 +156,7 @@ def parse_email_body(raw, log=None):
             segment = part.get_payload()
             if not segment:
                 continue
-            segment = decode_email_segment(segment.encode('utf-8'), charset, part.get('Content-Transfer-Encoding'))
+            segment = decode_email_segment(segment, charset, part.get('Content-Transfer-Encoding'))
             log.debug(segment)
             body = body + segment
         elif 'attachment' in cdispo or 'inline' in cdispo:
@@ -199,8 +189,7 @@ def parse_email_body(raw, log=None):
         else:
             log.warning("Unknown content type %s", parsed.get_content_type())
             body = "Unknown content type"
-        body = decode_email_segment(body.encode('utf-8'), parsed.get_content_charset(),
-                                    parsed.get('Content-Transfer-Encoding'))
+        body = decode_email_segment(body, parsed.get_content_charset(), parsed.get('Content-Transfer-Encoding'))
         log.debug(body)

     return parsed, body, attachments
@@ -214,8 +203,6 @@ def receive_email(envelope, log=None):
     header_to = parsed.get('To')
     header_in_reply_to = ascii_strip(parsed.get('In-Reply-To'))
     header_message_id = ascii_strip(parsed.get('Message-ID'))
-    maybe_spam = parsed.get('X-Spam')
-    suspected_spam = (maybe_spam and maybe_spam.lower() == 'yes')

     if match(r'^([a-zA-Z ]*<)?MAILER-DAEMON@', header_from) and envelope.mail_from.strip("<>") == "":
         log.warning("Ignoring mailer daemon")
@@ -223,20 +210,18 @@ def receive_email(envelope, log=None):

     if Email.objects.filter(reference=header_message_id).exists():  # break before issue thread is created
         log.warning("Email already exists")
-        raise SpecialMailException("Email already exists")
+        raise Exception("Email already exists")

     recipient = envelope.rcpt_tos[0].lower() if envelope.rcpt_tos else header_to.lower()
     sender = envelope.mail_from if envelope.mail_from else header_from
     subject = ascii_strip(parsed.get('Subject'))
     if not subject:
         subject = "No subject"
-    subject = decode_inline_encodings(subject)
-    recipient = decode_inline_encodings(recipient)
-    sender = decode_inline_encodings(sender)
+    subject = unescape_and_decode_quoted_printable(subject)
+    subject = unescape_and_decode_base64(subject)
     target_event = find_target_event(recipient)

-    active_issue_thread, new = find_active_issue_thread(
-        header_in_reply_to, recipient, subject, target_event, suspected_spam)
+    active_issue_thread, new = find_active_issue_thread(header_in_reply_to, recipient, subject, target_event)

     from hashlib import sha256
     random_filename = 'mail-' + sha256(envelope.content).hexdigest()
@@ -254,7 +239,7 @@ def receive_email(envelope, log=None):
     if new:
         # auto reply if new issue
         references = collect_references(active_issue_thread)
-        if not sender.startswith('noreply') and not sender.startswith('no-reply') and not suspected_spam:
+        if not sender.startswith('noreply'):
            subject = f"Re: {subject} [#{active_issue_thread.short_uuid()}]"
            body = '''Your request (#{}) has been received and will be reviewed by our lost&found angels.

@@ -267,7 +252,7 @@ do not create a new request.

Your c3lf (Cloakroom + Lost&Found) Team'''.format(active_issue_thread.short_uuid())
             reply_email = Email.objects.create(
-                sender=recipient, recipient=sender, body=body, subject=subject,
+                sender=recipient, recipient=sender, body=body, subject=ascii_strip(subject),
                 in_reply_to=header_message_id, event=target_event, issue_thread=active_issue_thread)
             reply = make_reply(reply_email, references, event=target_event.slug if target_event else None)
         else:
@@ -303,10 +288,10 @@ class LMTPHandler:
             systemevent = await database_sync_to_async(SystemEvent.objects.create)(type='email received',
                                                                                    reference=email.id)
             log.info(f"Created system event {systemevent.id}")
-            #channel_layer = get_channel_layer()
-            #await channel_layer.group_send(
-            #    'general', {"type": "generic.event", "name": "send_message_to_frontend", "event_id": systemevent.id,
-            #                "message": "email received"})
+            channel_layer = get_channel_layer()
+            await channel_layer.group_send(
+                'general', {"type": "generic.event", "name": "send_message_to_frontend", "event_id": systemevent.id,
+                            "message": "email received"})
             log.info(f"Sent message to frontend")
             if new and reply:
                 log.info('Sending message to %s' % reply['To'])
```
```diff
@@ -165,7 +165,7 @@ class LMTPHandlerTestCase(TestCase): # TODO replace with less hacky test
         self.assertEqual('Text mit Quoted-Printable-Kodierung: äöüß', Email.objects.all()[0].body)
         self.assertTrue(Email.objects.all()[0].raw_file.path)

-    def test_handle_base64_inline(self):
+    def test_handle_base64(self):
         from aiosmtpd.smtp import Envelope
         from asgiref.sync import async_to_sync
         import aiosmtplib
@@ -186,35 +186,6 @@ class LMTPHandlerTestCase(TestCase): # TODO replace with less hacky test
         self.assertEqual('Text mit Base64-Kodierung: äöüß', Email.objects.all()[0].body)
         self.assertTrue(Email.objects.all()[0].raw_file.path)

-    def test_handle_base64_transfer_encoding(self):
-        from aiosmtpd.smtp import Envelope
-        from asgiref.sync import async_to_sync
-        import aiosmtplib
-        aiosmtplib.send = make_mocked_coro()
-        handler = LMTPHandler()
-        server = mock.Mock()
-        session = mock.Mock()
-        envelope = Envelope()
-        envelope.mail_from = 'test1@test'
-        envelope.rcpt_tos = ['test2@test']
-        envelope.content = b'''Subject: test
-From: test3@test
-To: test4@test
-Message-ID: <1@test>
-Content-Type: text/plain; charset=utf-8
-Content-Transfer-Encoding: base64
-
-VGVzdCBtaXQgQmFzZTY0LUtvZGllcnVuZzogw6TDtsO8w58='''
-
-        result = async_to_sync(handler.handle_DATA)(server, session, envelope)
-        self.assertEqual(result, '250 Message accepted for delivery')
-        self.assertEqual(len(Email.objects.all()), 2)
-        self.assertEqual(len(IssueThread.objects.all()), 1)
-        aiosmtplib.send.assert_called_once()
-        self.assertEqual('test', Email.objects.all()[0].subject)
-        self.assertEqual('Test mit Base64-Kodierung: äöüß', Email.objects.all()[0].body)
-        self.assertTrue(Email.objects.all()[0].raw_file.path)
-
     def test_handle_client_reply(self):
         issue_thread = IssueThread.objects.create(
             name="test",
@@ -812,44 +783,6 @@ dGVzdGltYWdl
         self.assertEqual(None, IssueThread.objects.all()[0].assigned_to)
         aiosmtplib.send.assert_called_once()

-    def test_mail_spam_header(self):
-        from aiosmtpd.smtp import Envelope
-        from asgiref.sync import async_to_sync
-        import aiosmtplib
-        aiosmtplib.send = make_mocked_coro()
-        handler = LMTPHandler()
-        server = mock.Mock()
-        session = mock.Mock()
-        envelope = Envelope()
-        envelope.mail_from = 'test1@test'
-        envelope.rcpt_tos = ['test2@test']
-        envelope.content = b'''Subject: test
-From: test1@test
-To: test2@test
-Message-ID: <1@test>
-X-Spam: Yes
-
-test'''
-        result = async_to_sync(handler.handle_DATA)(server, session, envelope)
-
-        self.assertEqual(result, '250 Message accepted for delivery')
-        self.assertEqual(len(Email.objects.all()), 1)  # do not send auto reply if spam is suspected
-        self.assertEqual(len(IssueThread.objects.all()), 1)
-        aiosmtplib.send.assert_not_called()
-        self.assertEqual('test', Email.objects.all()[0].subject)
-        self.assertEqual('test1@test', Email.objects.all()[0].sender)
-        self.assertEqual('test2@test', Email.objects.all()[0].recipient)
-        self.assertEqual('test', Email.objects.all()[0].body)
-        self.assertEqual(IssueThread.objects.all()[0], Email.objects.all()[0].issue_thread)
-        self.assertEqual('<1@test>', Email.objects.all()[0].reference)
-        self.assertEqual(None, Email.objects.all()[0].in_reply_to)
-        self.assertEqual('test', IssueThread.objects.all()[0].name)
-        self.assertEqual('pending_suspected_spam', IssueThread.objects.all()[0].state)
-        self.assertEqual(None, IssueThread.objects.all()[0].assigned_to)
-        states = StateChange.objects.filter(issue_thread=IssueThread.objects.all()[0])
-        self.assertEqual(1, len(states))
-        self.assertEqual('pending_suspected_spam', states[0].state)
-
     def test_mail_4byte_unicode_emoji(self):
         from aiosmtpd.smtp import Envelope
         from asgiref.sync import async_to_sync
```
```diff
@@ -13,7 +13,7 @@ Automat==22.10.0
 beautifulsoup4==4.12.2
 bs4==0.0.1
 certifi==2023.11.17
-#cffi==1.16.0
+cffi==1.16.0
 channels==4.0.0
 channels-redis==4.1.0
 charset-normalizer==3.3.2
@@ -40,12 +40,12 @@ inflection==0.5.1
 itypes==1.2.0
 Jinja2==3.1.2
 MarkupSafe==2.1.3
-#msgpack==1.0.7
-#msgpack-python==0.5.6
+msgpack==1.0.7
+msgpack-python==0.5.6
 multidict==6.0.5
 openapi-codec==1.3.2
 packaging==23.2
-Pillow==11.1.0
+Pillow==10.1.0
 pyasn1==0.5.1
 pyasn1-modules==0.3.0
 pycares==4.4.0
@@ -69,6 +69,7 @@ typing_extensions==4.8.0
 uritemplate==4.1.1
 urllib3==2.1.0
 uvicorn==0.24.0.post1
+watchfiles==0.21.0
 websockets==12.0
 yarl==1.9.4
 zope.interface==6.1
```
```diff
@@ -102,6 +102,12 @@ def manual_ticket(request, event_slug):
         subject=request.data['name'],
         body=request.data['body'],
     )
+    systemevent = SystemEvent.objects.create(type='email received', reference=email.id)
+    channel_layer = get_channel_layer()
+    async_to_sync(channel_layer.group_send)(
+        'general', {"type": "generic.event", "name": "send_message_to_frontend", "event_id": systemevent.id,
+                    "message": "email received"}
+    )

     return Response(IssueSerializer(issue).data, status=status.HTTP_201_CREATED)

@@ -127,75 +133,48 @@ def add_comment(request, pk):
         issue_thread=issue,
         comment=request.data['comment'],
     )
+    systemevent = SystemEvent.objects.create(type='comment added', reference=comment.id)
+    channel_layer = get_channel_layer()
+    async_to_sync(channel_layer.group_send)(
+        'general', {"type": "generic.event", "name": "send_message_to_frontend", "event_id": systemevent.id,
+                    "message": "comment added"}
+    )
     return Response(CommentSerializer(comment).data, status=status.HTTP_201_CREATED)


 def filter_issues(issues, query):
     query_tokens = query.lower().split(' ')
-    matches = []
     for issue in issues:
         value = 0
-        if "T#" + issue.short_uuid() in query:
-            value += 12
-            matches.append(
-                {'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()} and matched "T#{issue.short_uuid()}"'})
-        elif "#" + issue.short_uuid() in query:
-            value += 11
-            matches.append(
-                {'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()} and matched "#{issue.short_uuid()}"'})
-        elif issue.short_uuid() in query:
+        if issue.short_uuid() in query:
             value += 10
-            matches.append({'type': 'ticket_uuid', 'text': f'is exactly {issue.short_uuid()}'})
         if "T#" + str(issue.id) in query:
             value += 10
-            matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "T#{issue.id}"'})
         elif "#" + str(issue.id) in query:
-            value += 7
-            matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id} and matched "#{issue.id}"'})
-        elif str(issue.id) in query:
-            value += 4
-            matches.append({'type': 'ticket_id', 'text': f'is exactly {issue.id}'})
+            value += 9
         for item in issue.related_items:
             if "I#" + str(item.id) in query:
                 value += 8
-                matches.append({'type': 'item_id', 'text': f'is exactly {item.id} and matched "I#{item.id}"'})
             elif "#" + str(item.id) in query:
                 value += 5
-                matches.append({'type': 'item_id', 'text': f'is exactly {item.id} and matched "#{item.id}"'})
-            elif str(item.id) in query:
-                value += 3
-                matches.append({'type': 'item_id', 'text': f'is exactly {item.id}'})
             for token in query_tokens:
                 if token in item.description.lower():
                     value += 1
-                    matches.append({'type': 'item_description', 'text': f'contains {token}'})
-            for comment in item.comments.all():
-                for token in query_tokens:
-                    if token in comment.comment.lower():
-                        value += 1
-                        matches.append({'type': 'item_comment', 'text': f'contains {token}'})
         for token in query_tokens:
             if token in issue.name.lower():
                 value += 1
-                matches.append({'type': 'ticket_name', 'text': f'contains {token}'})
         for comment in issue.comments.all():
             for token in query_tokens:
                 if token in comment.comment.lower():
                     value += 1
-                    matches.append({'type': 'ticket_comment', 'text': f'contains {token}'})
         for email in issue.emails.all():
             for token in query_tokens:
                 if token in email.subject.lower():
                     value += 1
-                    matches.append({'type': 'email_subject', 'text': f'contains {token}'})
                 if token in email.body.lower():
                     value += 1
-                    matches.append({'type': 'email_body', 'text': f'contains {token}'})
-                if token in email.sender.lower():
-                    value += 1
-                    matches.append({'type': 'email_sender', 'text': f'contains {token}'})
         if value > 0:
-            yield {'search_score': value, 'issue': issue, 'search_matches': matches}
+            yield {'search_score': value, 'issue': issue}


 @api_view(['GET'])
```
```diff
@@ -1,18 +0,0 @@
-# Generated by Django 4.2.7 on 2025-03-15 21:31
-
-from django.db import migrations, models
-
-
-class Migration(migrations.Migration):
-
-    dependencies = [
-        ('tickets', '0012_remove_issuethread_related_items_and_more'),
-    ]
-
-    operations = [
-        migrations.AlterField(
-            model_name='statechange',
-            name='state',
-            field=models.CharField(choices=[('pending_new', 'New'), ('pending_open', 'Open'), ('pending_shipping', 'Needs to be shipped'), ('pending_physical_confirmation', 'Needs to be confirmed physically'), ('pending_return', 'Needs to be returned'), ('pending_postponed', 'Postponed'), ('pending_suspected_spam', 'Suspected Spam'), ('waiting_details', 'Waiting for details'), ('waiting_pre_shipping', 'Waiting for Address/Shipping Info'), ('closed_returned', 'Closed: Returned'), ('closed_shipped', 'Closed: Shipped'), ('closed_not_found', 'Closed: Not found'), ('closed_not_our_problem', 'Closed: Not our problem'), ('closed_duplicate', 'Closed: Duplicate'), ('closed_timeout', 'Closed: Timeout'), ('closed_spam', 'Closed: Spam'), ('closed_nothing_missing', 'Closed: Nothing missing'), ('closed_wtf', 'Closed: WTF'), ('found_open', 'Item Found and stored externally'), ('found_closed', 'Item Found and stored externally and closed')], default='pending_new', max_length=255),
-        ),
-    ]
```
```diff
@@ -16,7 +16,6 @@ STATE_CHOICES = (
     ('pending_physical_confirmation', 'Needs to be confirmed physically'),
     ('pending_return', 'Needs to be returned'),
     ('pending_postponed', 'Postponed'),
-    ('pending_suspected_spam', 'Suspected Spam'),
     ('waiting_details', 'Waiting for details'),
     ('waiting_pre_shipping', 'Waiting for Address/Shipping Info'),
     ('closed_returned', 'Closed: Returned'),
@@ -47,11 +46,6 @@ class IssueThread(SoftDeleteModel):
     event = models.ForeignKey(Event, null=True, on_delete=models.SET_NULL, related_name='issue_threads')
     manually_created = models.BooleanField(default=False)

-    def __init__(self, *args, **kwargs):
-        if 'initial_state' in kwargs:
-            self._initial_state = kwargs.pop('initial_state')
-        super().__init__(*args, **kwargs)
-
     def short_uuid(self):
         return self.uuid[:8]

@@ -116,9 +110,8 @@ def set_uuid(sender, instance, **kwargs):

 @receiver(post_save, sender=IssueThread)
 def create_issue_thread(sender, instance, created, **kwargs):
-    if created and instance.state_changes.count() == 0:
-        initial_state = getattr(instance, '_initial_state', None)
-        StateChange.objects.create(issue_thread=instance, state=initial_state if initial_state else 'pending_new')
+    if created:
+        StateChange.objects.create(issue_thread=instance, state='pending_new')


 class Comment(models.Model):
```
```diff
@@ -139,12 +139,10 @@ class IssueSerializer(BasicIssueSerializer):

 class SearchResultSerializer(serializers.Serializer):
     search_score = serializers.IntegerField()
-    search_matches = serializers.ListField(child=serializers.DictField())
     issue = IssueSerializer()

     def to_representation(self, instance):
-        return {**IssueSerializer(instance['issue']).data, 'search_score': instance['search_score'],
-                'search_matches': instance['search_matches']}
+        return {**IssueSerializer(instance['issue']).data, 'search_score': instance['search_score']}

     class Meta:
         model = IssueThread
```
```diff
@@ -9,7 +9,6 @@ class RelationSerializer(serializers.ModelSerializer):
     class Meta:
         model = ItemRelation
         fields = ('id', 'status', 'timestamp', 'item', 'issue_thread')
-        read_only_fields = ('id', 'timestamp')


 class BasicIssueSerializer(serializers.ModelSerializer):
```
@ -4,7 +4,6 @@ from django.test import TestCase, Client
|
||||||
|
|
||||||
from authentication.models import ExtendedUser
|
from authentication.models import ExtendedUser
|
||||||
from inventory.models import Event, Container, Item
|
from inventory.models import Event, Container, Item
|
||||||
from inventory.models import Comment as ItemComment
|
|
||||||
from mail.models import Email, EmailAttachment
|
from mail.models import Email, EmailAttachment
|
||||||
from tickets.models import IssueThread, StateChange, Comment, ItemRelation, Assignment
|
from tickets.models import IssueThread, StateChange, Comment, ItemRelation, Assignment
|
||||||
from django.contrib.auth.models import Permission
|
from django.contrib.auth.models import Permission
|
||||||
|
@ -408,16 +407,16 @@ class IssueSearchTest(TestCase):
|
||||||
mail1 = Email.objects.create(
|
mail1 = Email.objects.create(
|
||||||
subject='test',
|
subject='test',
|
||||||
body='test aBc',
|
body='test aBc',
|
||||||
sender='bar@test',
|
sender='test',
|
||||||
recipient='2@test',
|
recipient='test',
|
||||||
issue_thread=issue,
|
issue_thread=issue,
|
||||||
timestamp=now,
|
timestamp=now,
|
||||||
)
|
)
|
||||||
mail2 = Email.objects.create(
|
mail2 = Email.objects.create(
|
||||||
subject='Re: test',
|
subject='test',
|
||||||
body='test',
|
body='test',
|
||||||
sender='2@test',
|
sender='test',
|
||||||
recipient='1@test',
|
recipient='test',
|
||||||
issue_thread=issue,
|
issue_thread=issue,
|
||||||
in_reply_to=mail1.reference,
|
in_reply_to=mail1.reference,
|
||||||
timestamp=now + timedelta(seconds=2),
|
timestamp=now + timedelta(seconds=2),
|
||||||
|
@ -437,11 +436,6 @@ class IssueSearchTest(TestCase):
|
||||||
item=self.item,
|
item=self.item,
|
||||||
timestamp=now + timedelta(seconds=5),
|
timestamp=now + timedelta(seconds=5),
|
||||||
)
|
)
|
||||||
item_comment = ItemComment.objects.create(
|
|
||||||
item=self.item,
|
|
||||||
comment="baz",
|
|
||||||
timestamp=now + timedelta(seconds=6),
|
|
||||||
)
|
|
||||||
search_query = b64encode(b'abC').decode('utf-8')
|
search_query = b64encode(b'abC').decode('utf-8')
|
||||||
response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/')
|
response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/')
|
||||||
self.assertEqual(200, response.status_code)
|
self.assertEqual(200, response.status_code)
|
||||||
|
@@ -471,21 +465,3 @@ class IssueSearchTest(TestCase):
         self.assertGreater(score3, score2)
         self.assertGreater(score2, score1)
         self.assertGreater(score1, 0)
-
-        search_query = b64encode(b'foo').decode('utf-8')
-        response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/')
-        self.assertEqual(200, response.status_code)
-        self.assertEqual(1, len(response.json()))
-        self.assertEqual(issue.id, response.json()[0]['id'])
-
-        search_query = b64encode(b'bar').decode('utf-8')
-        response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/')
-        self.assertEqual(200, response.status_code)
-        self.assertEqual(1, len(response.json()))
-        self.assertEqual(issue.id, response.json()[0]['id'])
-
-        search_query = b64encode(b'baz').decode('utf-8')
-        response = self.client.get(f'/api/2/{self.event.slug}/tickets/{search_query}/')
-        self.assertEqual(200, response.status_code)
-        self.assertEqual(1, len(response.json()))
-        self.assertEqual(issue.id, response.json()[0]['id'])

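Both branches expose ticket search as `/api/2/<event>/tickets/<base64-query>/`, which the test above exercises. A minimal sketch of issuing such a request with Django's test client (`client` and `event_slug` are assumed to be set up as in the test case):

```python
from base64 import b64encode

# encode the free-text query into the URL, as the tests do
query = b64encode('abC'.encode('utf-8')).decode('utf-8')  # short ASCII queries encode without URL-unsafe characters
response = client.get(f'/api/2/{event_slug}/tickets/{query}/')
assert response.status_code == 200
for hit in response.json():
    # each hit is a serialized issue enriched with its search_score (see the serializer diff earlier)
    print(hit['id'], hit['search_score'])
```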
@@ -1,8 +1,13 @@
-FROM python:3.11-slim-bookworm
+FROM python:3.11-bookworm
 LABEL authors="lagertonne"

 ENV PYTHONUNBUFFERED 1
 RUN mkdir /code
 WORKDIR /code
 COPY requirements.dev.txt /code/
-RUN pip install -r requirements.dev.txt
+COPY requirements.prod.txt /code/
+RUN apt update && apt install -y mariadb-client
+RUN pip install -r requirements.dev.txt
+RUN pip install -r requirements.prod.txt
+RUN pip install mysqlclient
+COPY .. /code/

@@ -1,4 +1,4 @@
-FROM node:22-alpine
+FROM docker.io/node:22

 RUN mkdir /web
 WORKDIR /web

@@ -1,4 +1,3 @@
-name: c3lf-sys3-dev
 services:
   core:
     build:

@@ -7,12 +6,10 @@ services:
       command: bash -c 'python manage.py migrate && python testdata.py && python manage.py runserver 0.0.0.0:8000'
       environment:
         - HTTP_HOST=core
-        - DB_FILE=.local/dev.db
-        - DEBUG_MODE_ACTIVE=true
+        - DB_FILE=dev.db
       volumes:
-        - ../../core:/code:ro
-        - ../testdata.py:/code/testdata.py:ro
-        - backend_context:/code/.local
+        - ../../core:/code
+        - ../testdata.py:/code/testdata.py
       ports:
         - "8000:8000"

@@ -22,12 +19,10 @@ services:
       dockerfile: ../deploy/dev/Dockerfile.frontend
       command: npm run serve
       volumes:
-        - ../../web/src:/web/src
+        - ../../web:/web:ro
+        - /web/node_modules
         - ./vue.config.js:/web/vue.config.js
       ports:
         - "8080:8080"
       depends_on:
         - core

-volumes:
-  backend_context:

@@ -1,11 +1,11 @@
-FROM python:3.11-slim-bookworm
+FROM python:3.11-bookworm
 LABEL authors="lagertonne"

 ENV PYTHONUNBUFFERED 1
 RUN mkdir /code
 WORKDIR /code
-RUN apt update && apt install -y pkg-config mariadb-client default-libmysqlclient-dev build-essential
-RUN pip install mysqlclient
 COPY requirements.prod.txt /code/
+RUN apt update && apt install -y mariadb-client
 RUN pip install -r requirements.prod.txt
+RUN pip install mysqlclient
 COPY .. /code/

@@ -1,4 +1,4 @@
-FROM node:22-alpine
+FROM docker.io/node:22

 RUN mkdir /web
 WORKDIR /web

@@ -1,4 +1,3 @@
-name: c3lf-sys3-testing
 services:
   redis:
     image: redis

@@ -32,9 +31,8 @@ services:
         - DB_PASSWORD=system3
         - MAIL_DOMAIN=mail:1025
       volumes:
-        - ../../core:/code:ro
-        - ../testdata.py:/code/testdata.py:ro
-        - backend_context:/code
+        - ../../core:/code
+        - ../testdata.py:/code/testdata.py
       ports:
         - "8000:8000"
       depends_on:

@@ -49,8 +47,8 @@ services:
       command: npm run serve
       volumes:
         - ../../web:/web:ro
-        - ./vue.config.js:/web/vue.config.js:ro
-        - frontend_context:/web
+        - /web/node_modules
+        - ./vue.config.js:/web/vue.config.js
       ports:
         - "8080:8080"
       depends_on:

@@ -72,5 +70,3 @@ services:
 volumes:
   mariadb_data:
   mailpit_data:
-  frontend_context:
-  backend_context:

web/package-lock.json (generated, 9399 lines changed): diff suppressed because it is too large.

@@ -24,15 +24,6 @@
             <span class="timeline-item-icon faded-icon" v-else-if="item.type === 'placement'">
                 <font-awesome-icon icon="archive"/>
             </span>
-            <span class="timeline-item-icon faded-icon" v-else-if="item.type === 'created'">
-                <font-awesome-icon icon="archive"/>
-            </span>
-            <span class="timeline-item-icon faded-icon" v-else-if="item.type === 'returned'">
-                <font-awesome-icon icon="archive"/>
-            </span>
-            <span class="timeline-item-icon faded-icon" v-else-if="item.type === 'deleted'">
-                <font-awesome-icon icon="trash"/>
-            </span>
             <span class="timeline-item-icon faded-icon" v-else>
                 <font-awesome-icon icon="pen"/>
             </span>

@@ -44,9 +35,6 @@
             <TimelineShippingVoucher v-else-if="item.type === 'shipping_voucher'" :item="item"/>
             <TimelinePlacement v-else-if="item.type === 'placement'" :item="item"/>
             <TimelineRelatedTicket v-else-if="item.type === 'issue_relation'" :item="item"/>
-            <TimelineCreated v-else-if="item.type === 'created'" :item="item"/>
-            <TimelineReturned v-else-if="item.type === 'returned'" :item="item"/>
-            <TimelineDeleted v-else-if="item.type === 'deleted'" :item="item"/>
             <p v-else>{{ item }}</p>
         </li>
         <li class="timeline-item">

@@ -70,16 +58,10 @@ import TimelineShippingVoucher from "@/components/TimelineShippingVoucher.vue";
 import AsyncButton from "@/components/inputs/AsyncButton.vue";
 import TimelinePlacement from "@/components/TimelinePlacement.vue";
 import TimelineRelatedTicket from "@/components/TimelineRelatedTicket.vue";
-import TimelineCreated from "@/components/TimelineCreated.vue";
-import TimelineReturned from "@/components/TimelineReturned.vue";
-import TimelineDeleted from "@/components/TimelineDeleted.vue";

 export default {
     name: 'Timeline',
     components: {
-        TimelineDeleted,
-        TimelineReturned,
-        TimelineCreated,
         TimelineRelatedTicket,
         TimelinePlacement,
         TimelineShippingVoucher,

@@ -221,4 +203,4 @@ a {
 }


 </style>

@@ -1,83 +0,0 @@
-<template>
-    <div class="timeline-item-description"><span>created by
-        <i class="avatar | small">
-            <font-awesome-icon icon="user"/>
-        </i>
-        <a href="#">$USER</a> at <time :datetime="timestamp">{{ timestamp }}</time></span>
-    </div>
-</template>
-
-<script>
-
-import {mapState} from "vuex";
-
-export default {
-    name: 'TimelineCreated',
-    props: {
-        'item': {
-            type: Object,
-            required: true
-        }
-    },
-    computed: {
-        'timestamp': function () {
-            return new Date(this.item.timestamp).toLocaleString();
-        },
-
-    }
-};
-</script>
-
-<style scoped>
-
-
-a {
-    color: inherit;
-}
-
-
-.timeline-item-description {
-    display: flex;
-    padding-top: 6px;
-    gap: 8px;
-    color: var(--gray);
-
-    img {
-        flex-shrink: 0;
-    }
-
-    a {
-        /*color: var(--c-grey-500);*/
-        font-weight: 500;
-        text-decoration: none;
-
-        &:hover,
-        &:focus {
-            outline: 0; /* Don't actually do this */
-            color: var(--info);
-        }
-    }
-}
-
-.avatar {
-    display: inline-flex;
-    align-items: center;
-    justify-content: center;
-    border-radius: 50%;
-    overflow: hidden;
-    aspect-ratio: 1 / 1;
-    flex-shrink: 0;
-    width: 40px;
-    height: 40px;
-
-    &.small {
-        width: 28px;
-        height: 28px;
-    }
-
-    img {
-        object-fit: cover;
-    }
-}
-
-</style>

@@ -1,83 +0,0 @@
-<template>
-    <div class="timeline-item-description"><span>marked deleted by
-        <i class="avatar | small">
-            <font-awesome-icon icon="user"/>
-        </i>
-        <a href="#">$USER</a> at <time :datetime="timestamp">{{ timestamp }}</time></span>
-    </div>
-</template>
-
-<script>
-
-import {mapState} from "vuex";
-
-export default {
-    name: 'TimelineDeleted',
-    props: {
-        'item': {
-            type: Object,
-            required: true
-        }
-    },
-    computed: {
-        'timestamp': function () {
-            return new Date(this.item.timestamp).toLocaleString();
-        },
-
-    }
-};
-</script>
-
-<style scoped>
-
-
-a {
-    color: inherit;
-}
-
-
-.timeline-item-description {
-    display: flex;
-    padding-top: 6px;
-    gap: 8px;
-    color: var(--gray);
-
-    img {
-        flex-shrink: 0;
-    }
-
-    a {
-        /*color: var(--c-grey-500);*/
-        font-weight: 500;
-        text-decoration: none;
-
-        &:hover,
-        &:focus {
-            outline: 0; /* Don't actually do this */
-            color: var(--info);
-        }
-    }
-}
-
-.avatar {
-    display: inline-flex;
-    align-items: center;
-    justify-content: center;
-    border-radius: 50%;
-    overflow: hidden;
-    aspect-ratio: 1 / 1;
-    flex-shrink: 0;
-    width: 40px;
-    height: 40px;
-
-    &.small {
-        width: 28px;
-        height: 28px;
-    }
-
-    img {
-        object-fit: cover;
-    }
-}
-
-</style>

@@ -1,83 +0,0 @@
-<template>
-    <div class="timeline-item-description"><span>marked returned by
-        <i class="avatar | small">
-            <font-awesome-icon icon="user"/>
-        </i>
-        <a href="#">$USER</a> at <time :datetime="timestamp">{{ timestamp }}</time></span>
-    </div>
-</template>
-
-<script>
-
-import {mapState} from "vuex";
-
-export default {
-    name: 'TimelineReturned',
-    props: {
-        'item': {
-            type: Object,
-            required: true
-        }
-    },
-    computed: {
-        'timestamp': function () {
-            return new Date(this.item.timestamp).toLocaleString();
-        },
-
-    }
-};
-</script>
-
-<style scoped>
-
-
-a {
-    color: inherit;
-}
-
-
-.timeline-item-description {
-    display: flex;
-    padding-top: 6px;
-    gap: 8px;
-    color: var(--gray);
-
-    img {
-        flex-shrink: 0;
-    }
-
-    a {
-        /*color: var(--c-grey-500);*/
-        font-weight: 500;
-        text-decoration: none;
-
-        &:hover,
-        &:focus {
-            outline: 0; /* Don't actually do this */
-            color: var(--info);
-        }
-    }
-}
-
-.avatar {
-    display: inline-flex;
-    align-items: center;
-    justify-content: center;
-    border-radius: 50%;
-    overflow: hidden;
-    aspect-ratio: 1 / 1;
-    flex-shrink: 0;
-    width: 40px;
-    height: 40px;
-
-    &.small {
-        width: 28px;
-        height: 28px;
-    }
-
-    img {
-        object-fit: cover;
-    }
-}
-
-</style>

@@ -1,9 +1,9 @@
 <template>
-    <button @click.stop="handleClick" :disabled="disabled || inProgress">
+    <button @click.stop="handleClick" :disabled="disabled">
         <span class="spinner-border spinner-border-sm" role="status" aria-hidden="true"
-              :class="{'d-none': !inProgress}"></span>
-        <span class="ml-2" :class="{'d-none': !inProgress}">In Progress...</span>
-        <span :class="{'d-none': inProgress}"><slot></slot></span>
+              :class="{'d-none': !disabled}"></span>
+        <span class="ml-2" :class="{'d-none': !disabled}">In Progress...</span>
+        <span :class="{'d-none': disabled}"><slot></slot></span>
     </button>
 </template>

@@ -13,7 +13,7 @@ export default {
     name: 'AsyncButton',
     data() {
         return {
-            inProgress: false,
+            disabled: false,
         };
     },
     props: {

@@ -21,21 +21,17 @@ export default {
             type: Function,
             required: true,
         },
-        disabled: {
-            type: Boolean,
-            required: false,
-        },
     },
     methods: {
         async handleClick() {
             if (this.task && typeof this.task === 'function') {
-                this.inProgress = true;
+                this.disabled = true;
                 try {
                     await this.task();
                 } catch (e) {
                     console.error(e);
                 } finally {
-                    this.inProgress = false;
+                    this.disabled = false;
                 }
             }
         },

@@ -47,4 +43,4 @@ export default {
 .spinner-border {
     vertical-align: -0.125em;
 }
 </style>

@@ -61,6 +61,7 @@ const store = createStore({
             '2kg-de': '2kg Paket (DE)',
             '5kg-de': '5kg Paket (DE)',
             '10kg-de': '10kg Paket (DE)',
+            '2kg-eu': '2kg Paket (EU)',
             '5kg-eu': '5kg Paket (EU)',
             '10kg-eu': '10kg Paket (EU)',
         }

@@ -76,26 +77,10 @@ const store = createStore({
         getEventTickets: (state, getters) => getters.getEventSlug === 'all' ? getters.getAllTickets : getters.getAllTickets.filter(t => t.event === getters.getEventSlug || (t.event == null && getters.getEventSlug === 'none')),
         isItemsLoaded: (state, getters) => (getters.getEventSlug === 'all' || getters.getEventSlug === 'none') ? !!state.loadedItems : Object.keys(state.loadedItems).includes(getters.getEventSlug),
         isTicketsLoaded: (state, getters) => (getters.getEventSlug === 'all' || getters.getEventSlug === 'none') ? !!state.loadedTickets : Object.keys(state.loadedTickets).includes(getters.getEventSlug),
-        getItemsSearchResults: (state, getters) => {
-            if (getters.getEventSlug === 'all') {
-                return state.events.map(e => {
-                    return state.loadedItemSearchResults[e.slug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || []
-                }).flat();
-            } else {
-                return state.loadedItemSearchResults[getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || []
-            }
-        },
-        getTicketsSearchResults: (state, getters) => {
-            if (getters.getEventSlug === 'all') {
-                return state.events.map(e => {
-                    return state.loadedTicketSearchResults[e.slug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || []
-                }).flat();
-            } else {
-                return state.loadedTicketSearchResults[getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || []
-            }
-        },
-        isItemsSearchLoaded: (state, getters) => Object.keys(state.loadedItemSearchResults).includes(getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))) || getters.getEventSlug === 'all',
-        isTicketsSearchLoaded: (state, getters) => Object.keys(state.loadedTicketSearchResults).includes(getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))) || getters.getEventSlug === 'all',
+        getItemsSearchResults: (state, getters) => state.loadedItemSearchResults[getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || [],
+        getTicketsSearchResults: (state, getters) => state.loadedTicketSearchResults[getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))] || [],
+        isItemsSearchLoaded: (state, getters) => Object.keys(state.loadedItemSearchResults).includes(getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))),
+        isTicketsSearchLoaded: (state, getters) => Object.keys(state.loadedTicketSearchResults).includes(getters.getEventSlug + '/' + base64.encode(utf8.encode(getters.searchQuery))),
         getActiveView: state => router.currentRoute.value.name || 'items',
         getFilters: state => router.currentRoute.value.query,
         getBoxes: state => state.loadedBoxes,

@@ -394,39 +379,26 @@ const store = createStore({
         },
         async loadEventItems({commit, getters, state}) {
             if (!state.user.token) return;
-            const load = async (slug) => {
-                try {
-                    const {data, success} = await getters.session.get(`/2/${slug}/items/`);
-                    if (data && success) {
-                        commit('setItems', {slug, items: data});
-                    }
-                } catch (e) {
-                    console.error("Error loading items");
-                }
-            }
-            const slug = getters.getEventSlug;
-            if (slug === 'all') {
-                await Promise.all(state.events.map(e => load(e.slug)));
-            } else {
-                await load(slug);
-            }
+            if (state.fetchedData.items > Date.now() - 1000 * 60 * 60 * 24) return;
+            try {
+                const slug = getters.getEventSlug;
+                const {data, success} = await getters.session.get(`/2/${slug}/items/`);
+                if (data && success) {
+                    commit('setItems', {slug, items: data});
+                }
+            } catch (e) {
+                console.error("Error loading items");
+            }
         },
         async searchEventItems({commit, getters, state}, query) {
             const encoded_query = base64.encode(utf8.encode(query));
-            const load = async (slug) => {
-                if (Object.keys(state.loadedItemSearchResults).includes(slug + '/' + encoded_query)) return;
-                const {
-                    data, success
-                } = await getters.session.get(`/2/${slug}/items/${encoded_query}/`);
-                if (data && success) {
-                    commit('setItemSearchResults', {slug, query: encoded_query, items: data});
-                }
-            }
             const slug = getters.getEventSlug;
-            if (slug === 'all') {
-                await Promise.all(state.events.map(e => load(e.slug)));
-            } else {
-                await load(slug);
-            }
+            if (Object.keys(state.loadedItemSearchResults).includes(slug + '/' + encoded_query)) return;
+            const {
+                data, success
+            } = await getters.session.get(`/2/${slug}/items/${encoded_query}/`);
+            if (data && success) {
+                commit('setItemSearchResults', {slug, query: encoded_query, items: data});
             }
         },
         async loadBoxes({commit, state, getters}) {

@@ -474,19 +446,12 @@ const store = createStore({
         },
         async searchEventTickets({commit, getters, state}, query) {
             const encoded_query = base64.encode(utf8.encode(query));
-            const load = async (slug) => {
-                if (Object.keys(state.loadedTicketSearchResults).includes(slug + '/' + encoded_query)) return;
-                const {
-                    data, success
-                } = await getters.session.get(`/2/${slug}/tickets/${encoded_query}/`);
-                if (data && success) commit('setTicketSearchResults', {slug, query: encoded_query, items: data});
-            }
             const slug = getters.getEventSlug;
-            if (slug === 'all') {
-                await Promise.all(state.events.map(e => load(e.slug)));
-            } else {
-                await load(slug);
-            }
+            if (Object.keys(state.loadedTicketSearchResults).includes(slug + '/' + encoded_query)) return;
+            const {
+                data, success
+            } = await getters.session.get(`/2/${slug}/tickets/${encoded_query}/`);
+            if (data && success) commit('setTicketSearchResults', {slug, query: encoded_query, items: data});
         },
         async sendMail({commit, dispatch, state, getters}, {id, message}) {
             const {data, success} = await getters.session.post(`/2/tickets/${id}/reply/`, {message},

@@ -563,14 +528,6 @@ const store = createStore({
                 state.fetchedData.tickets = 0;
                 await Promise.all([dispatch('loadTickets'), dispatch('fetchShippingVouchers')]);
             }
-        },
-        async linkTicketItem({dispatch, state, getters}, {ticket_id, item_id}) {
-            const {data, success} = await getters.session.post(`/2/matches/`, {issue_thread: ticket_id, item: item_id});
-            if (data && success) {
-                state.fetchedData.tickets = 0;
-                state.fetchedData.items = 0;
-                await Promise.all([dispatch('loadTickets'), dispatch('loadEventItems')]);
-            }
         }
     },
     plugins: [persistentStatePlugin({ // TODO change remember to some kind of enable field

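The `linkTicketItem` action removed above created an item/ticket relation by posting to the matches endpoint. A minimal sketch of the equivalent request in Python; the `/api` prefix and DRF token authentication are assumptions based on the test paths shown earlier in this comparison, not something this diff confirms:

```python
import requests

def link_ticket_item(base_url, token, ticket_id, item_id):
    """Create an ItemRelation between an issue thread (ticket) and an item."""
    response = requests.post(
        f"{base_url}/api/2/matches/",                  # endpoint taken from the removed store action
        json={"issue_thread": ticket_id, "item": item_id},
        headers={"Authorization": f"Token {token}"},   # assumed token auth; adjust to your deployment
    )
    response.raise_for_status()
    return response.json()
```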
@@ -17,7 +17,7 @@
                     <textarea placeholder="add comment..." v-model="newComment"
                               class="form-control">
                     </textarea>
-                    <AsyncButton class="btn btn-secondary float-right" :task="addCommentAndClear" :disabled="!newComment">
+                    <AsyncButton class="btn btn-primary float-right" :task="addCommentAndClear">
                         <font-awesome-icon icon="comment"/>
                         Save Comment
                     </AsyncButton>

@@ -25,7 +25,7 @@
                 </div>
             </template>
             <template v-slot:timeline_action2>
-                <span class="timeline-item-icon | filled-icon">
+                <span class="timeline-item-icon | faded-icon">
                     <font-awesome-icon icon="envelope"/>
                 </span>
                 <div class="new-mail card bg-dark">

@@ -35,7 +35,7 @@
                     <div>
                         <textarea placeholder="reply mail..." v-model="newMail" class="form-control">
                         </textarea>
-                        <AsyncButton class="btn btn-primary float-right" :task="sendMailAndClear" :disabled="!newMail">
+                        <AsyncButton class="btn btn-primary float-right" :task="sendMailAndClear">
                            <font-awesome-icon icon="envelope"/>
                            Send Mail
                        </AsyncButton>

@@ -81,13 +81,6 @@
                 <font-awesome-icon icon="clipboard"/>
                 Copy DHL contact to clipboard
             </ClipboardButton>
-            <div class="btn-group">
-                <input type="text" class="form-control" v-model="item_id">
-                <button class="form-control btn btn-success" :disabled="!item_id"
-                        @click="linkTicketItem({ticket_id: ticket.id, item_id: parseInt(item_id)}).then(()=>item_id='')">
-                    Link Item
-                </button>
-            </div>
             <div class="btn-group">
                 <select class="form-control" v-model="shipping_voucher_type">
                     <option v-for="type in availableShippingVoucherTypes.filter(t=>t.count>0)"

@@ -148,7 +141,6 @@ export default {
             selected_state: null,
             selected_assignee: null,
             shipping_voucher_type: null,
-            item_id: "",
             newMail: "",
             newComment: ""
         }

@@ -174,7 +166,6 @@ export default {
         ...mapActions(['deleteItem', 'markItemReturned', 'sendMail', 'updateTicketPartial', 'postComment']),
         ...mapActions(['loadTickets', 'fetchTicketStates', 'loadUsers', 'scheduleAfterInit']),
         ...mapActions(['claimShippingVoucher', 'fetchShippingVouchers']),
-        ...mapActions(['linkTicketItem']),
         ...mapMutations(['openLightboxModalWith']),
         changeTicketStatus() {
             this.ticket.state = this.selected_state;

@@ -207,10 +198,10 @@ export default {
         },
         mounted() {
             this.scheduleAfterInit(() => [Promise.all([this.fetchTicketStates(), this.loadTickets(), this.loadUsers(), this.fetchShippingVouchers()]).then(() => {
-                //if (this.ticket.state === "pending_new") {
-                //    this.selected_state = "pending_open";
-                //    this.changeTicketStatus()
-                //}
+                if (this.ticket.state === "pending_new") {
+                    this.selected_state = "pending_open";
+                    this.changeTicketStatus()
+                }
                 this.selected_state = this.ticket.state;
                 this.selected_assignee = this.ticket.assigned_to
             })]);

@@ -25,7 +25,7 @@
               :columns="['id', 'name', 'last_activity', 'assigned_to',
                          ...(getEventSlug==='all'?['event']:[])]"
               :keyName="'state'" :sections="['pending_new', 'pending_open','pending_shipping',
-                         'pending_physical_confirmation','pending_return','pending_postponed','pending_suspected_spam'].map(stateInfo)">
+                         'pending_physical_confirmation','pending_return','pending_postponed'].map(stateInfo)">
         <template #section_header="{index, section, count}">
             {{ section.text }} <span class="badge badge-light ml-1">{{ count }}</span>
         </template>

@@ -26,7 +26,7 @@
               :columns="['id', 'name', 'last_activity', 'assigned_to',
                          ...(getEventSlug==='all'?['event']:[])]"
               :keyName="'state'" :sections="['pending_new', 'pending_open','pending_shipping',
-                         'pending_physical_confirmation','pending_return','pending_postponed','pending_suspected_spam'].map(stateInfo)">
+                         'pending_physical_confirmation','pending_return','pending_postponed'].map(stateInfo)">
         <template #section_header="{index, section, count}">
             {{ section.text }} <span class="badge badge-light ml-1">{{ count }}</span>
         </template>
