milestone/frontend (#63)

* Introduce separate server package
* Implement basic visualization of solve response
* Make fetch endpoint depend on environment
* Introduce features flag for localhost cors support
* Serve static files from './assets' directory
* Add Dockerfile as example for server with frontend
* Support multiple solving strategies
* Support stable model semantics with nogoods
* Introduce custom node type for nicer layout
* Support more options and multiple models
* Use standard example for adfs on the frontend
* Use unoptimised hybrid step for better presentation
* Upgrade frontend dependencies
* Animate graph changes
* Experiment with timeout on API endpoints
* Relax CORS restrictions for local development
* Add API for adding/deleting users; login; logout
* Add API for uploading and solving adf problems
* Add API for getting and updating user
* Return early for parse and solve; Add Adf GET
* Add Delete and Index endpoints for ADFs
* Add basic UI for user endpoints
* Enforce username and password to be set on login
* Show colored snackbars
* Allow file upload for ADF; fix some server bugs
* Implement ADF Add Form and Overview
* Add Detail View for ADF problems
* Add docker-compose file for mongodb (development)
* Add mongodb (DEV) data directory to dockerignore
* Let unknown routes be handled by frontend
* Add legal information page to frontend
* Change G6 Graph layout slightly
* Add missing doc comments to lib
* Update legal information regarding cookies
* Add project logos to frontend
* Add help texts to frontend
* Move DoubleLabeledGraph from lib to server
* Give example for custom Adf datastructure in docs
* Update README and Project Website
* Update devskim.yml
* Add READMEs for frontend and server

---------

Co-authored-by: monsterkrampe <monsterkrampe@users.noreply.github.com>
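Several of the bullets above add an HTTP API for uploading and solving ADF problems together with a React frontend that consumes it. As orientation for the diff below, here is a minimal TypeScript sketch of the solve request; it mirrors the `fetch` call in `frontend/src/components/adf-details.tsx`, and the `solveProblem` helper itself is only illustrative (the PR inlines this logic in the component rather than defining such a helper).

```ts
// Sketch only: `solveProblem` is not defined by this PR; the component inlines this logic.
type Strategy = 'Ground' | 'Complete' | 'Stable' | 'StableCountingA' | 'StableCountingB' | 'StableNogood';

async function solveProblem(adfName: string, strategy: Strategy): Promise<boolean> {
  // In development the frontend talks to the server on localhost:8080;
  // in production the API is served from the same origin as the static assets.
  const base = process.env.NODE_ENV === 'development' ? '//localhost:8080' : '';
  const res = await fetch(`${base}/adf/${adfName}/solve`, {
    method: 'PUT',
    credentials: process.env.NODE_ENV === 'development' ? 'include' : 'same-origin',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ strategy }),
  });
  // On success the UI re-fetches GET /adf/{name} and keeps polling every 20 seconds
  // while the task is still listed as running.
  return res.status === 200;
}
```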
5  .dockerignore  Normal file
@@ -0,0 +1,5 @@
frontend/node_modules
frontend/.parcel-cache
server/mongodb-data
target/debug
12  .github/workflows/devskim.yml  vendored
@@ -7,11 +7,11 @@ name: DevSkim
 
 on:
   push:
-    branches: [ main ]
+    branches: [ "main" ]
   pull_request:
-    branches: [ main ]
+    branches: [ "main" ]
   schedule:
-    - cron: '26 6 * * 5'
+    - cron: '15 6 * * 4'
 
 jobs:
   lint:
@@ -23,12 +23,12 @@ jobs:
       security-events: write
     steps:
       - name: Checkout code
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
 
       - name: Run DevSkim scanner
         uses: microsoft/DevSkim-Action@v1
 
       - name: Upload DevSkim scan results to GitHub Security tab
-        uses: github/codeql-action/upload-sarif@v1
+        uses: github/codeql-action/upload-sarif@v2
         with:
           sarif_file: devskim-results.sarif
2566  Cargo.lock  generated

Cargo.toml
@@ -1,7 +1,7 @@
 [workspace]
-members=[ "lib", "bin" ]
+members=[ "lib", "bin", "server" ]
 default-members = [ "lib" ]
 
 [profile.release]
 lto = "fat"
 codegen-units = 1
36  Dockerfile  Normal file
@@ -0,0 +1,36 @@
# 1. BUILD-CONTAINER: Frontend
FROM node:hydrogen-alpine

WORKDIR /root

COPY ./frontend /root

RUN yarn && yarn build

# 2. BUILD-CONTAINER: Server
FROM rust:alpine

WORKDIR /root

RUN apk add --no-cache musl-dev

COPY ./bin /root/bin
COPY ./lib /root/lib
COPY ./server /root/server
COPY ./Cargo.toml /root/Cargo.toml
COPY ./Cargo.lock /root/Cargo.lock

RUN cargo build --workspace --release

# 3. RUNTIME-CONTAINER: run server with frontend as assets
FROM alpine:latest

WORKDIR /root

COPY --from=0 /root/dist /root/assets
COPY --from=1 /root/target/release/adf-bdd-server /root/server

EXPOSE 8080

ENTRYPOINT ["./server"]
README.md
@@ -11,13 +11,15 @@
 
 # Abstract Dialectical Frameworks solved by (ordered) Binary Decision Diagrams; developed in Dresden (ADF-oBDD project)
 
-This project is currently split into two parts:
+This project is currently split into three parts:
 - a [binary (adf-bdd)](bin), which allows one to easily answer semantics questions on abstract dialectical frameworks
 - a [library (adf_bdd)](lib), which contains all the necessary algorithms and an open API which compute the answers to the semantics questions
+- a [server](server) and a [frontend](frontend) to access the solver as a web-service available at https://adf-bdd.dev
 
 Latest documentation of the API can be found [here](https://docs.rs/adf_bdd/latest/adf_bdd/).
 The current version of the binary can be downloaded [here](https://github.com/ellmau/adf-obdd/releases).
 
 
 Do not hesitate to report bugs or ask about features in the [issues-section](https://github.com/ellmau/adf-obdd/issues) or have a conversation about anything of the project in the [discussion space](https://github.com/ellmau/adf-obdd/discussions)
@@ -8,8 +8,8 @@
 
 [](https://github.com/ellmau/adf-obdd/discussions) 
 
-| [Home](index.md) | [Binary](adf-bdd.md) | [Library](adf_bdd.md)| [Repository](https://github.com/ellmau/adf-obdd) |
-|--- | --- | --- | --- |
+| [Home](index.md) | [Binary](adf-bdd.md) | [Library](adf_bdd.md)| [Web-Service](https://adf-bdd.dev) | [Repository](https://github.com/ellmau/adf-obdd) |
+|--- | --- | --- | --- | --- |
 
 # Abstract Dialectical Frameworks solved by Binary Decision Diagrams; developed in Dresden (ADF-BDD)
 This is the readme for the executable solver.
@@ -8,8 +8,8 @@
 
 [](https://github.com/ellmau/adf-obdd/discussions) 
 
-| [Home](index.md) | [Binary](adf-bdd.md) | [Library](adf_bdd.md)| [Repository](https://github.com/ellmau/adf-obdd) |
-|--- | --- | --- | --- |
+| [Home](index.md) | [Binary](adf-bdd.md) | [Library](adf_bdd.md)| [Web-Service](https://adf-bdd.dev) | [Repository](https://github.com/ellmau/adf-obdd) |
+|--- | --- | --- | --- | --- |
 
 # Abstract Dialectical Frameworks solved by Binary Decision Diagrams; developed in Dresden (ADF_BDD)
 This library contains an efficient representation of Abstract Dialectical Frameworks (ADf) by utilising an implementation of Ordered Binary Decision Diagrams (OBDD)
@@ -8,14 +8,16 @@
 
 [](https://github.com/ellmau/adf-obdd/discussions) 
 
-| [Home](index.md) | [Binary](adf-bdd.md) | [Library](adf_bdd.md)| [Repository](https://github.com/ellmau/adf-obdd) |
-|--- | --- | --- | --- |
+| [Home](index.md) | [Binary](adf-bdd.md) | [Library](adf_bdd.md)| [Web-Service](https://adf-bdd.dev) | [Repository](https://github.com/ellmau/adf-obdd) |
+|--- | --- | --- | --- | --- |
 
 # Abstract Dialectical Frameworks solved by (ordered) Binary Decision Diagrams; developed in Dresden (ADF-oBDD project)
 
-This project is currently split into two parts:
+
+This project is currently split into three parts:
 - a [binary (adf-bdd)](adf-bdd.md), which allows one to easily answer semantics questions on abstract dialectical frameworks
 - a [library (adf_bdd)](adf_bdd.md), which contains all the necessary algorithms and an open API which compute the answers to the semantics questions
+- a server and a frontend, available at https://adf-bdd.dev
 
 Latest documentation of the API can be found [here](https://docs.rs/adf_bdd/latest/adf_bdd/).
 The current version of the binary can be downloaded [here](https://github.com/ellmau/adf-obdd/releases).
24  flake.lock  generated
@@ -2,11 +2,11 @@
   "nodes": {
     "flake-utils": {
       "locked": {
-        "lastModified": 1667395993,
-        "narHash": "sha256-nuEHfE/LcWyuSWnS8t12N1wc105Qtau+/OdUAjtQ0rA=",
+        "lastModified": 1678901627,
+        "narHash": "sha256-U02riOqrKKzwjsxc/400XnElV+UtPUQWpANPlyazjH0=",
         "owner": "numtide",
         "repo": "flake-utils",
-        "rev": "5aed5285a952e0b949eb3ba02c12fa4fcfef535f",
+        "rev": "93a2b84fc4b70d9e089d029deacc3583435c2ed6",
         "type": "github"
       },
       "original": {
@@ -33,11 +33,11 @@
     },
     "nixpkgs": {
       "locked": {
-        "lastModified": 1672441588,
-        "narHash": "sha256-jx5kxOyeObnVD44HRebKYL3cjWrcKhhcDmEYm0/naDY=",
+        "lastModified": 1679966490,
+        "narHash": "sha256-k0jV+y1jawE6w4ZvKgXDNg4+O9NNtcaWwzw8gufv0b4=",
         "owner": "NixOS",
         "repo": "nixpkgs",
-        "rev": "6a0d2701705c3cf6f42c15aa92b7885f1f8a477f",
+        "rev": "5b7cd5c39befee629be284970415b6eb3b0ff000",
         "type": "github"
       },
       "original": {
@@ -49,11 +49,11 @@
     },
     "nixpkgs-unstable": {
       "locked": {
-        "lastModified": 1672350804,
-        "narHash": "sha256-jo6zkiCabUBn3ObuKXHGqqORUMH27gYDIFFfLq5P4wg=",
+        "lastModified": 1679944645,
+        "narHash": "sha256-e5Qyoe11UZjVfgRfwNoSU57ZeKuEmjYb77B9IVW7L/M=",
         "owner": "NixOS",
         "repo": "nixpkgs",
-        "rev": "677ed08a50931e38382dbef01cba08a8f7eac8f6",
+        "rev": "4bb072f0a8b267613c127684e099a70e1f6ff106",
         "type": "github"
      },
       "original": {
@@ -82,11 +82,11 @@
         ]
       },
       "locked": {
-        "lastModified": 1672626196,
-        "narHash": "sha256-BfdLrMqxqa4YA1I1wgPBQyu4FPzL0Tp4WI2C5S6BuYo=",
+        "lastModified": 1680056830,
+        "narHash": "sha256-WB4KD8oLSxAAtmXYSzwVwQusC2Gy5vTEln1uTt0GI2k=",
         "owner": "oxalica",
         "repo": "rust-overlay",
-        "rev": "c8bf9c162bb3f734cf357846e995eb70b94e2bcd",
+        "rev": "c8d8d05b8100d451243b614d950fa3f966c1fcc2",
         "type": "github"
       },
       "original": {
13  frontend/.editorconfig  Normal file
@@ -0,0 +1,13 @@
root = true

[*]
end_of_line = lf
insert_final_newline = true

[*.{ts,tsx}]
indent_style = space
indent_size = 2

[package.json]
indent_style = space
indent_size = 2
27  frontend/.eslintrc.js  Normal file
@@ -0,0 +1,27 @@
module.exports = {
  "env": {
    "browser": true,
    "es2021": true
  },
  "extends": [
    "plugin:react/recommended",
    "airbnb",
    "airbnb-typescript",
  ],
  "parser": "@typescript-eslint/parser",
  "parserOptions": {
    "ecmaFeatures": {
      "jsx": true
    },
    "ecmaVersion": "latest",
    "sourceType": "module",
    "project": "tsconfig.json"
  },
  "plugins": [
    "react",
    "@typescript-eslint"
  ],
  "rules": {
    "react/jsx-filename-extension": [1, { "extensions": [".tsx"] }]
  }
}
5  frontend/.gitignore  vendored  Normal file
@@ -0,0 +1,5 @@
node_modules
dist
.parcel-cache
yarn-error.log
13  frontend/README.md  Normal file
@@ -0,0 +1,13 @@
# Frontend for Webservice

This directory contains the (standalone) frontend for <https://adf-bdd.dev> built using React, Material UI and Typescript.

## Usage

For local development run:

- `yarn install` to install the dependencies
- `yarn run check` to run typechecks and the linter (eslint)
- `yarn start` to start the development server listening on `localhost:1234`

The frontend tries to connect to the server at `localhost:8080` in development mode.
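A minimal sketch of that development/production switch, with a hypothetical `apiFetch` helper that this PR does not define (the components inline the same logic), looks like this:

```ts
// Sketch only: in development the app on localhost:1234 calls the server on
// localhost:8080 and sends cookies cross-origin (hence the relaxed CORS handling
// for local development); in production requests stay same-origin, since the
// server also serves the built assets.
const API_BASE = process.env.NODE_ENV === 'development' ? '//localhost:8080' : '';

function apiFetch(path: string, init: RequestInit = {}): Promise<Response> {
  return fetch(`${API_BASE}${path}`, {
    credentials: process.env.NODE_ENV === 'development' ? 'include' : 'same-origin',
    headers: { 'Content-Type': 'application/json' },
    ...init,
  });
}

// e.g. list the current user's ADF problems, as adf-overview.tsx does:
apiFetch('/adf/', { method: 'GET' });
```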
5  frontend/index.d.ts  vendored  Normal file
@@ -0,0 +1,5 @@
declare module 'bundle-text:*' {
  const s: string
  export default s
}
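This ambient declaration types Parcel's `bundle-text:` imports, which the components in this PR use to pull the markdown help texts in as plain strings, for example (as in `adf-details.tsx` further down):

```ts
// Parcel's inline-string transformer bundles the file contents as a string;
// the declaration above is what gives this import the `string` type.
import DetailInfoMd from 'bundle-text:../help-texts/detail-info.md';

// The component then renders it with the Markdown wrapper:
// <Markdown>{DetailInfoMd}</Markdown>
```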
40  frontend/package.json  Normal file
@@ -0,0 +1,40 @@
{
  "name": "ADF-OBDD-Frontend",
  "version": "0.1.0",
  "source": "src/index.html",
  "browserslist": "> 0.5%, last 2 versions, not dead",
  "scripts": {
    "check": "tsc --noEmit && eslint ./src",
    "start": "parcel",
    "build": "parcel build"
  },
  "devDependencies": {
    "@parcel/transformer-inline-string": "2.8.2",
    "@types/node": "^18.15.11",
    "@types/react": "^18.0.26",
    "@types/react-dom": "^18.0.10",
    "@typescript-eslint/eslint-plugin": "^5.48.1",
    "@typescript-eslint/parser": "^5.48.1",
    "eslint": "^8.31.0",
    "eslint-config-airbnb": "^19.0.4",
    "eslint-config-airbnb-typescript": "^17.0.0",
    "eslint-plugin-import": "^2.27.4",
    "eslint-plugin-jsx-a11y": "^6.7.1",
    "eslint-plugin-react": "^7.32.0",
    "parcel": "^2.8.2",
    "process": "^0.11.10",
    "typescript": "^4.9.4"
  },
  "dependencies": {
    "@antv/g6": "^4.8.3",
    "@emotion/react": "^11.10.5",
    "@emotion/styled": "^11.10.5",
    "@fontsource/roboto": "^4.5.8",
    "@mui/icons-material": "^5.11.16",
    "@mui/material": "^5.11.4",
    "markdown-to-jsx": "^7.2.0",
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "react-router-dom": "^6.10.0"
  }
}
8  frontend/shell.nix  Normal file
@@ -0,0 +1,8 @@
{ pkgs ? import <nixpkgs> {} }:

pkgs.mkShell {
  buildInputs = [
    pkgs.yarn
  ];
}
13  frontend/src/app.tsx  Normal file
@@ -0,0 +1,13 @@
import * as React from 'react';
import { createRoot } from 'react-dom/client';

import '@fontsource/roboto/300.css';
import '@fontsource/roboto/400.css';
import '@fontsource/roboto/500.css';
import '@fontsource/roboto/700.css';

import App from './components/app';

const container = document.getElementById('app');
const root = createRoot(container!);
root.render(<App />);
371
frontend/src/components/adf-details.tsx
Normal file
@ -0,0 +1,371 @@
|
||||
import React, {
|
||||
useState, useContext, useEffect, useCallback, useRef,
|
||||
} from 'react';
|
||||
import { useParams, useNavigate } from 'react-router-dom';
|
||||
import {
|
||||
Accordion,
|
||||
AccordionDetails,
|
||||
AccordionSummary,
|
||||
Alert,
|
||||
AlertColor,
|
||||
Button,
|
||||
Chip,
|
||||
Container,
|
||||
Grid,
|
||||
Paper,
|
||||
Pagination,
|
||||
Skeleton,
|
||||
Stack,
|
||||
Tabs,
|
||||
Tab,
|
||||
TextField,
|
||||
Typography,
|
||||
} from '@mui/material';
|
||||
|
||||
import ExpandMoreIcon from '@mui/icons-material/ExpandMore';
|
||||
|
||||
import DetailInfoMd from 'bundle-text:../help-texts/detail-info.md';
|
||||
import Markdown from './markdown';
|
||||
|
||||
import GraphG6, { GraphProps } from './graph-g6';
|
||||
import LoadingContext from './loading-context';
|
||||
import SnackbarContext from './snackbar-context';
|
||||
|
||||
export type Parsing = 'Naive' | 'Hybrid';
|
||||
|
||||
export type StrategySnakeCase = 'parse_only' | 'ground' | 'complete' | 'stable' | 'stable_counting_a' | 'stable_counting_b' | 'stable_nogood';
|
||||
|
||||
export type StrategyCamelCase = 'ParseOnly' | 'Ground' | 'Complete' | 'Stable' | 'StableCountingA' | 'StableCountingB' | 'StableNogood';
|
||||
export const STRATEGIES_WITHOUT_PARSE: StrategyCamelCase[] = ['Ground', 'Complete', 'Stable', 'StableCountingA', 'StableCountingB', 'StableNogood'];
|
||||
|
||||
export interface AcAndGraph {
|
||||
ac: string[],
|
||||
graph: GraphProps,
|
||||
}
|
||||
|
||||
export type AcsWithGraphsOpt = {
|
||||
type: 'None',
|
||||
} | {
|
||||
type: 'Error',
|
||||
content: string
|
||||
} | {
|
||||
type: 'Some',
|
||||
content: AcAndGraph[]
|
||||
};
|
||||
|
||||
export type Task = {
|
||||
type: 'Parse',
|
||||
} | {
|
||||
type: 'Solve',
|
||||
content: StrategyCamelCase,
|
||||
};
|
||||
|
||||
export interface AdfProblemInfo {
|
||||
name: string,
|
||||
code: string,
|
||||
parsing_used: Parsing,
|
||||
// NOTE: the keys are really only strategies
|
||||
acs_per_strategy: { [key in StrategySnakeCase]: AcsWithGraphsOpt },
|
||||
running_tasks: Task[],
|
||||
}
|
||||
|
||||
export function acsWithGraphOptToColor(status: AcsWithGraphsOpt, running: boolean): AlertColor {
|
||||
if (running) {
|
||||
return 'warning';
|
||||
}
|
||||
|
||||
switch (status.type) {
|
||||
case 'None': return 'info';
|
||||
case 'Error': return 'error';
|
||||
case 'Some': return 'success';
|
||||
default:
|
||||
throw new Error('Unknown type union variant (cannot occur)');
|
||||
}
|
||||
}
|
||||
|
||||
export function acsWithGraphOptToText(status: AcsWithGraphsOpt, running: boolean): string {
|
||||
if (running) {
|
||||
return 'Running';
|
||||
}
|
||||
|
||||
switch (status.type) {
|
||||
case 'None': return 'Not attempted';
|
||||
case 'Error': return 'Failed';
|
||||
case 'Some': return 'Done';
|
||||
default:
|
||||
throw new Error('Unknown type union variant (cannot occur)');
|
||||
}
|
||||
}
|
||||
|
||||
function AdfDetails() {
|
||||
const { adfName } = useParams();
|
||||
const navigate = useNavigate();
|
||||
|
||||
const { setLoading } = useContext(LoadingContext);
|
||||
const { status: snackbarInfo, setStatus: setSnackbarInfo } = useContext(SnackbarContext);
|
||||
const [problem, setProblem] = useState<AdfProblemInfo>();
|
||||
const [tab, setTab] = useState<StrategySnakeCase>('parse_only');
|
||||
const [solutionIndex, setSolutionIndex] = useState<number>(0);
|
||||
|
||||
const isFirstRender = useRef(true);
|
||||
|
||||
const fetchProblem = useCallback(
|
||||
() => {
|
||||
fetch(`${process.env.NODE_ENV === 'development' ? '//localhost:8080' : ''}/adf/${adfName}`, {
|
||||
method: 'GET',
|
||||
credentials: process.env.NODE_ENV === 'development' ? 'include' : 'same-origin',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
})
|
||||
.then((res) => {
|
||||
switch (res.status) {
|
||||
case 200:
|
||||
res.json().then((resProblem) => {
|
||||
setProblem(resProblem);
|
||||
});
|
||||
break;
|
||||
default:
|
||||
navigate('/');
|
||||
break;
|
||||
}
|
||||
});
|
||||
},
|
||||
[setProblem],
|
||||
);
|
||||
|
||||
const solveHandler = useCallback(
|
||||
(strategy: StrategyCamelCase) => {
|
||||
setLoading(true);
|
||||
|
||||
fetch(`${process.env.NODE_ENV === 'development' ? '//localhost:8080' : ''}/adf/${adfName}/solve`, {
|
||||
method: 'PUT',
|
||||
credentials: process.env.NODE_ENV === 'development' ? 'include' : 'same-origin',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify({ strategy }),
|
||||
})
|
||||
.then((res) => {
|
||||
switch (res.status) {
|
||||
case 200:
|
||||
setSnackbarInfo({ message: 'Solving problem now...', severity: 'success', potentialUserChange: false });
|
||||
fetchProblem();
|
||||
break;
|
||||
default:
|
||||
setSnackbarInfo({ message: 'Something went wrong trying to solve the problem.', severity: 'error', potentialUserChange: false });
|
||||
break;
|
||||
}
|
||||
})
|
||||
.finally(() => setLoading(false));
|
||||
},
|
||||
[adfName],
|
||||
);
|
||||
|
||||
const deleteHandler = useCallback(
|
||||
() => {
|
||||
setLoading(true);
|
||||
|
||||
fetch(`${process.env.NODE_ENV === 'development' ? '//localhost:8080' : ''}/adf/${adfName}`, {
|
||||
method: 'DELETE',
|
||||
credentials: process.env.NODE_ENV === 'development' ? 'include' : 'same-origin',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
})
|
||||
.then((res) => {
|
||||
switch (res.status) {
|
||||
case 200:
|
||||
setSnackbarInfo({ message: 'ADF Problem deleted.', severity: 'success', potentialUserChange: false });
|
||||
navigate('/');
|
||||
break;
|
||||
default:
|
||||
break;
|
||||
}
|
||||
})
|
||||
.finally(() => setLoading(false));
|
||||
},
|
||||
[adfName],
|
||||
);
|
||||
|
||||
useEffect(
|
||||
() => {
|
||||
// TODO: having the info if the user may have changed on the snackbar info
|
||||
// is a bit lazy and unclean; be better!
|
||||
if (isFirstRender.current || snackbarInfo?.potentialUserChange) {
|
||||
isFirstRender.current = false;
|
||||
|
||||
fetchProblem();
|
||||
}
|
||||
},
|
||||
[snackbarInfo?.potentialUserChange],
|
||||
);
|
||||
|
||||
useEffect(
|
||||
() => {
|
||||
// if there is a running task, fetch problems again after 20 seconds
|
||||
let timeout: ReturnType<typeof setTimeout>;
|
||||
if (problem && problem.running_tasks.length > 0) {
|
||||
timeout = setTimeout(() => fetchProblem(), 20000);
|
||||
}
|
||||
|
||||
return () => {
|
||||
if (timeout) {
|
||||
clearTimeout(timeout);
|
||||
}
|
||||
};
|
||||
},
|
||||
[problem],
|
||||
);
|
||||
|
||||
const acsOpt = problem?.acs_per_strategy[tab];
|
||||
const acsContent = acsOpt?.type === 'Some' ? acsOpt.content : undefined;
|
||||
const tabCamelCase: StrategyCamelCase = tab.replace(/^([a-z])/, (_, p1) => p1.toUpperCase()).replace(/_([a-z])/g, (_, p1) => `${p1.toUpperCase()}`) as StrategyCamelCase;
|
||||
|
||||
return (
|
||||
<>
|
||||
<Typography variant="h3" component="h1" align="center" gutterBottom>
|
||||
ADF-BDD.DEV
|
||||
</Typography>
|
||||
<Container sx={{ marginTop: 2, marginBottom: 2 }}>
|
||||
<Accordion>
|
||||
<AccordionSummary expandIcon={<ExpandMoreIcon />}>
|
||||
<span style={{ fontWeight: 'bold' }}>What can I do with the ADF now?</span>
|
||||
</AccordionSummary>
|
||||
<AccordionDetails>
|
||||
<Grid container alignItems="center" spacing={2}>
|
||||
<Grid item xs={12} sm={8}>
|
||||
<Markdown>{DetailInfoMd}</Markdown>
|
||||
</Grid>
|
||||
<Grid item xs={12} sm={4}>
|
||||
<img
|
||||
src={new URL('../help-texts/example-bdd.png', import.meta.url).toString()}
|
||||
alt="Example BDD"
|
||||
style={{ maxWidth: '100%', borderRadius: 4, boxShadow: '0 0 5px 0 rgba(0,0,0,0.4)' }}
|
||||
/>
|
||||
</Grid>
|
||||
</Grid>
|
||||
</AccordionDetails>
|
||||
</Accordion>
|
||||
</Container>
|
||||
<Container sx={{ marginBottom: 4 }}>
|
||||
{problem ? (
|
||||
<>
|
||||
<Paper elevation={8} sx={{ padding: 2, marginBottom: 2 }}>
|
||||
<Stack direction="row" justifyContent="space-between" sx={{ marginBottom: 1 }}>
|
||||
<Button
|
||||
variant="outlined"
|
||||
color="info"
|
||||
onClick={() => { navigate('/'); }}
|
||||
>
|
||||
Back
|
||||
</Button>
|
||||
<Typography variant="h4" component="h2" align="center" gutterBottom>
|
||||
{problem.name}
|
||||
</Typography>
|
||||
<Button
|
||||
type="button"
|
||||
variant="outlined"
|
||||
color="error"
|
||||
onClick={() => {
|
||||
// eslint-disable-next-line no-alert
|
||||
if (window.confirm('Are you sure that you want to delete this ADF problem?')) {
|
||||
deleteHandler();
|
||||
}
|
||||
}}
|
||||
>
|
||||
Delete
|
||||
</Button>
|
||||
</Stack>
|
||||
<TextField
|
||||
name="code"
|
||||
label="Code"
|
||||
helperText="Click here to copy!"
|
||||
multiline
|
||||
maxRows={5}
|
||||
fullWidth
|
||||
variant="filled"
|
||||
value={problem.code.trim()}
|
||||
disabled
|
||||
sx={{ cursor: 'pointer' }}
|
||||
onClick={() => { navigator.clipboard.writeText(problem.code); setSnackbarInfo({ message: 'Code copied to clipboard!', severity: 'info', potentialUserChange: false }); }}
|
||||
/>
|
||||
</Paper>
|
||||
<Tabs
|
||||
value={tab}
|
||||
onChange={(_e, newTab) => { setTab(newTab); setSolutionIndex(0); }}
|
||||
variant="scrollable"
|
||||
scrollButtons="auto"
|
||||
>
|
||||
<Tab wrapped value="parse_only" label={<Chip color={acsWithGraphOptToColor(problem.acs_per_strategy.parse_only, problem.running_tasks.some((t: Task) => t.type === 'Parse'))} label={`${problem.parsing_used} Parsing`} sx={{ cursor: 'inherit' }} />} />
|
||||
{STRATEGIES_WITHOUT_PARSE.map((strategy) => {
|
||||
const spaced = strategy.replace(/([A-Za-z])([A-Z])/g, '$1 $2');
|
||||
const snakeCase = strategy.replace(/^([A-Z])/, (_, p1) => p1.toLowerCase()).replace(/([A-Z])/g, (_, p1) => `_${p1.toLowerCase()}`) as StrategySnakeCase;
|
||||
const status = problem.acs_per_strategy[snakeCase];
|
||||
|
||||
const running = problem.running_tasks.some((t: Task) => t.type === 'Solve' && t.content === strategy);
|
||||
|
||||
const color = acsWithGraphOptToColor(status, running);
|
||||
|
||||
return <Tab key={strategy} wrapped value={snakeCase} label={<Chip color={color} label={spaced} sx={{ cursor: 'inherit' }} />} />;
|
||||
})}
|
||||
</Tabs>
|
||||
|
||||
{acsContent && acsContent.length > 1 && (
|
||||
<>
|
||||
Models:
|
||||
<br />
|
||||
<Pagination variant="outlined" shape="rounded" count={acsContent.length} page={solutionIndex + 1} onChange={(_e, newIdx) => setSolutionIndex(newIdx - 1)} />
|
||||
</>
|
||||
)}
|
||||
<Paper elevation={3} square sx={{ padding: 2, marginTop: 4, marginBottom: 4 }}>
|
||||
{problem.running_tasks.some((t: Task) => (tab === 'parse_only' && t.type === 'Parse') || (t.type === 'Solve' && t.content === tabCamelCase)) ? (
|
||||
<Alert severity="warning">Working hard to solve the problem right now...</Alert>
|
||||
) : (
|
||||
<>
|
||||
{acsContent && acsContent.length > 0 && (
|
||||
<GraphG6 graph={acsContent[solutionIndex].graph} />
|
||||
)}
|
||||
{acsContent && acsContent.length === 0 && (
|
||||
<Alert severity="info">The problem has no models for this strategy.</Alert>
|
||||
)}
|
||||
{!acsContent && acsOpt?.type === 'Error' && (
|
||||
<Alert severity="error">
|
||||
An error occurred:
|
||||
{acsOpt.content}
|
||||
</Alert>
|
||||
)}
|
||||
{!acsContent && acsOpt?.type === 'None' && (
|
||||
<>
|
||||
<Alert severity="info" sx={{ marginBottom: 1 }}>This strategy was not attempted yet.</Alert>
|
||||
<Button
|
||||
variant="contained"
|
||||
size="large"
|
||||
color="warning"
|
||||
onClick={() => {
|
||||
solveHandler(tabCamelCase);
|
||||
}}
|
||||
>
|
||||
Solve now!
|
||||
</Button>
|
||||
</>
|
||||
)}
|
||||
</>
|
||||
)}
|
||||
</Paper>
|
||||
</>
|
||||
) : (
|
||||
<>
|
||||
<Paper elevation={8} sx={{ padding: 2, marginBottom: 8 }}>
|
||||
<Skeleton variant="text" width="50%" sx={{ fontSize: '2.125rem', margin: 'auto' }} />
|
||||
<Skeleton variant="rounded" width="100%" height={200} />
|
||||
</Paper>
|
||||
<Skeleton variant="rectangular" width="100%" height={500} />
|
||||
</>
|
||||
)}
|
||||
</Container>
|
||||
</>
|
||||
);
|
||||
}
|
||||
|
||||
export default AdfDetails;
|
||||
168
frontend/src/components/adf-new-form.tsx
Normal file
@ -0,0 +1,168 @@
|
||||
import React, {
|
||||
useState, useContext, useCallback, useRef,
|
||||
} from 'react';
|
||||
|
||||
import {
|
||||
Button,
|
||||
Container,
|
||||
FormControl,
|
||||
FormControlLabel,
|
||||
FormLabel,
|
||||
Link,
|
||||
Paper,
|
||||
Radio,
|
||||
RadioGroup,
|
||||
Stack,
|
||||
Typography,
|
||||
TextField,
|
||||
ToggleButtonGroup,
|
||||
ToggleButton,
|
||||
} from '@mui/material';
|
||||
|
||||
import LoadingContext from './loading-context';
|
||||
import SnackbarContext from './snackbar-context';
|
||||
|
||||
import { Parsing } from './adf-details';
|
||||
|
||||
const PLACEHOLDER = `s(a).
|
||||
s(b).
|
||||
s(c).
|
||||
s(d).
|
||||
ac(a,c(v)).
|
||||
ac(b,b).
|
||||
ac(c,and(a,b)).
|
||||
ac(d,neg(b)).`;
|
||||
|
||||
function AdfNewForm({ fetchProblems }: { fetchProblems: () => void; }) {
|
||||
const { setLoading } = useContext(LoadingContext);
|
||||
const { setStatus: setSnackbarInfo } = useContext(SnackbarContext);
|
||||
const [isFileUpload, setFileUpload] = useState(false);
|
||||
const [code, setCode] = useState(PLACEHOLDER);
|
||||
const [filename, setFilename] = useState('');
|
||||
const [parsing, setParsing] = useState<Parsing>('Naive');
|
||||
const [name, setName] = useState('');
|
||||
const fileRef = useRef<HTMLInputElement>(null);
|
||||
|
||||
const addAdf = useCallback(
|
||||
() => {
|
||||
setLoading(true);
|
||||
|
||||
const formData = new FormData();
|
||||
|
||||
if (isFileUpload && fileRef.current) {
|
||||
const file = fileRef.current.files?.[0];
|
||||
if (file) {
|
||||
formData.append('file', file);
|
||||
}
|
||||
} else {
|
||||
formData.append('code', code);
|
||||
}
|
||||
|
||||
formData.append('parsing', parsing);
|
||||
formData.append('name', name);
|
||||
|
||||
fetch(`${process.env.NODE_ENV === 'development' ? '//localhost:8080' : ''}/adf/add`, {
|
||||
method: 'POST',
|
||||
credentials: process.env.NODE_ENV === 'development' ? 'include' : 'same-origin',
|
||||
body: formData,
|
||||
})
|
||||
.then((res) => {
|
||||
switch (res.status) {
|
||||
case 200:
|
||||
setSnackbarInfo({ message: 'Successfully added ADF problem!', severity: 'success', potentialUserChange: true });
|
||||
fetchProblems();
|
||||
break;
|
||||
default:
|
||||
setSnackbarInfo({ message: 'An error occurred while adding the ADF problem.', severity: 'error', potentialUserChange: true });
|
||||
break;
|
||||
}
|
||||
})
|
||||
.finally(() => setLoading(false));
|
||||
},
|
||||
[isFileUpload, code, filename, parsing, name, fileRef.current],
|
||||
);
|
||||
|
||||
return (
|
||||
<Container>
|
||||
<Paper elevation={8} sx={{ padding: 2 }}>
|
||||
<Typography variant="h4" component="h2" align="center" gutterBottom>
|
||||
Add a new Problem
|
||||
</Typography>
|
||||
<Container sx={{ marginTop: 2, marginBottom: 2 }}>
|
||||
<Stack direction="row" justifyContent="center">
|
||||
<ToggleButtonGroup
|
||||
value={isFileUpload}
|
||||
exclusive
|
||||
onChange={(_e, newValue) => { setFileUpload(newValue); setFilename(''); }}
|
||||
>
|
||||
<ToggleButton value={false}>
|
||||
Write by Hand
|
||||
</ToggleButton>
|
||||
<ToggleButton value>
|
||||
Upload File
|
||||
</ToggleButton>
|
||||
</ToggleButtonGroup>
|
||||
</Stack>
|
||||
</Container>
|
||||
|
||||
<Container sx={{ marginTop: 2, marginBottom: 2 }}>
|
||||
{isFileUpload ? (
|
||||
<Stack direction="row" justifyContent="center">
|
||||
<Button component="label">
|
||||
{(!!filename && fileRef?.current?.files?.[0]) ? `File '${filename.split(/[\\/]/).pop()}' selected! (Click to change)` : 'Upload File'}
|
||||
<input hidden type="file" onChange={(event) => { setFilename(event.target.value); }} ref={fileRef} />
|
||||
</Button>
|
||||
</Stack>
|
||||
) : (
|
||||
<TextField
|
||||
name="code"
|
||||
label="Put your code here:"
|
||||
helperText={(
|
||||
<>
|
||||
For more info on the syntax, have a
|
||||
look
|
||||
{' '}
|
||||
<Link href="https://github.com/ellmau/adf-obdd" target="_blank" rel="noopener noreferrer">here</Link>
|
||||
.
|
||||
</>
|
||||
)}
|
||||
multiline
|
||||
fullWidth
|
||||
variant="filled"
|
||||
value={code}
|
||||
onChange={(event) => { setCode(event.target.value); }}
|
||||
/>
|
||||
)}
|
||||
</Container>
|
||||
|
||||
<Container sx={{ marginTop: 2 }}>
|
||||
<Stack direction="row" justifyContent="center" spacing={2}>
|
||||
<FormControl>
|
||||
<FormLabel id="parsing-radio-group">Parsing Strategy</FormLabel>
|
||||
<RadioGroup
|
||||
row
|
||||
aria-labelledby="parsing-radio-group"
|
||||
name="parsing"
|
||||
value={parsing}
|
||||
onChange={(e) => setParsing(((e.target as HTMLInputElement).value) as Parsing)}
|
||||
>
|
||||
<FormControlLabel value="Naive" control={<Radio />} label="Naive" />
|
||||
<FormControlLabel value="Hybrid" control={<Radio />} label="Hybrid" />
|
||||
</RadioGroup>
|
||||
</FormControl>
|
||||
<TextField
|
||||
name="name"
|
||||
label="Adf Problem Name (optional):"
|
||||
variant="standard"
|
||||
value={name}
|
||||
onChange={(event) => { setName(event.target.value); }}
|
||||
/>
|
||||
<Button variant="outlined" onClick={() => addAdf()}>Add Adf Problem</Button>
|
||||
</Stack>
|
||||
</Container>
|
||||
</Paper>
|
||||
</Container>
|
||||
);
|
||||
}
|
||||
|
||||
export default AdfNewForm;
|
||||
189
frontend/src/components/adf-overview.tsx
Normal file
@ -0,0 +1,189 @@
|
||||
import React, {
|
||||
useRef, useState, useCallback, useEffect, useContext,
|
||||
} from 'react';
|
||||
|
||||
import {
|
||||
useNavigate,
|
||||
} from 'react-router-dom';
|
||||
|
||||
import {
|
||||
Accordion,
|
||||
AccordionDetails,
|
||||
AccordionSummary,
|
||||
Chip,
|
||||
Container,
|
||||
Paper,
|
||||
TableContainer,
|
||||
Table,
|
||||
TableHead,
|
||||
TableRow,
|
||||
TableCell,
|
||||
TableBody,
|
||||
Typography,
|
||||
} from '@mui/material';
|
||||
|
||||
import ExpandMoreIcon from '@mui/icons-material/ExpandMore';
|
||||
|
||||
import AddInfoMd from 'bundle-text:../help-texts/add-info.md';
|
||||
import Markdown from './markdown';
|
||||
|
||||
import AdfNewForm from './adf-new-form';
|
||||
|
||||
import {
|
||||
AdfProblemInfo,
|
||||
StrategySnakeCase,
|
||||
STRATEGIES_WITHOUT_PARSE,
|
||||
Task,
|
||||
acsWithGraphOptToColor,
|
||||
acsWithGraphOptToText,
|
||||
} from './adf-details';
|
||||
|
||||
import SnackbarContext from './snackbar-context';
|
||||
|
||||
function AdfOverview() {
|
||||
const { status: snackbarInfo } = useContext(SnackbarContext);
|
||||
const [problems, setProblems] = useState<AdfProblemInfo[]>([]);
|
||||
|
||||
const navigate = useNavigate();
|
||||
|
||||
const isFirstRender = useRef(true);
|
||||
|
||||
const fetchProblems = useCallback(
|
||||
() => {
|
||||
fetch(`${process.env.NODE_ENV === 'development' ? '//localhost:8080' : ''}/adf/`, {
|
||||
method: 'GET',
|
||||
credentials: process.env.NODE_ENV === 'development' ? 'include' : 'same-origin',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
})
|
||||
.then((res) => {
|
||||
switch (res.status) {
|
||||
case 200:
|
||||
res.json().then((resProblems) => {
|
||||
setProblems(resProblems);
|
||||
});
|
||||
break;
|
||||
case 401:
|
||||
setProblems([]);
|
||||
break;
|
||||
default:
|
||||
break;
|
||||
}
|
||||
});
|
||||
},
|
||||
[setProblems],
|
||||
);
|
||||
|
||||
useEffect(
|
||||
() => {
|
||||
// TODO: having the info if the user may have changed on the snackbar info
|
||||
// is a bit lazy and unclean; be better!
|
||||
if (isFirstRender.current || snackbarInfo?.potentialUserChange) {
|
||||
isFirstRender.current = false;
|
||||
|
||||
fetchProblems();
|
||||
}
|
||||
},
|
||||
[snackbarInfo?.potentialUserChange],
|
||||
);
|
||||
|
||||
useEffect(
|
||||
() => {
|
||||
// if there is a running task, fetch problems again after 20 seconds
|
||||
let timeout: ReturnType<typeof setTimeout>;
|
||||
if (problems.some((p) => p.running_tasks.length > 0)) {
|
||||
timeout = setTimeout(() => fetchProblems(), 20000);
|
||||
}
|
||||
|
||||
return () => {
|
||||
if (timeout) {
|
||||
clearTimeout(timeout);
|
||||
}
|
||||
};
|
||||
},
|
||||
[problems],
|
||||
);
|
||||
|
||||
return (
|
||||
<>
|
||||
<Typography variant="h3" component="h1" align="center" gutterBottom>
|
||||
ADF-BDD.DEV
|
||||
</Typography>
|
||||
<Container sx={{ marginTop: 2, marginBottom: 2 }}>
|
||||
<Accordion>
|
||||
<AccordionSummary expandIcon={<ExpandMoreIcon />}>
|
||||
<span style={{ fontWeight: 'bold' }}>What is this webapp doing and how should I use it?</span>
|
||||
</AccordionSummary>
|
||||
<AccordionDetails>
|
||||
<Markdown>{AddInfoMd}</Markdown>
|
||||
</AccordionDetails>
|
||||
</Accordion>
|
||||
</Container>
|
||||
{problems.length > 0
|
||||
&& (
|
||||
<Container sx={{ marginBottom: 4 }}>
|
||||
<Paper elevation={8} sx={{ padding: 2 }}>
|
||||
<Typography variant="h4" component="h2" align="center" gutterBottom>
|
||||
Existing Problems
|
||||
</Typography>
|
||||
<TableContainer component={Paper}>
|
||||
<Table>
|
||||
<TableHead>
|
||||
<TableRow>
|
||||
<TableCell align="center">ADF Problem Name</TableCell>
|
||||
<TableCell align="center">Parse Status</TableCell>
|
||||
<TableCell align="center">Grounded Solution</TableCell>
|
||||
<TableCell align="center">Complete Solution</TableCell>
|
||||
<TableCell align="center">Stable Solution</TableCell>
|
||||
<TableCell align="center">Stable Solution (Counting Method A)</TableCell>
|
||||
<TableCell align="center">Stable Solution (Counting Method B)</TableCell>
|
||||
<TableCell align="center">Stable Solution (Nogood-Based)</TableCell>
|
||||
</TableRow>
|
||||
</TableHead>
|
||||
<TableBody>
|
||||
{problems.map((problem) => (
|
||||
<TableRow
|
||||
key={problem.name}
|
||||
onClick={() => { navigate(`/${problem.name}`); }}
|
||||
sx={{ '&:last-child td, &:last-child th': { border: 0 }, cursor: 'pointer' }}
|
||||
>
|
||||
<TableCell component="th" scope="row">
|
||||
{problem.name}
|
||||
</TableCell>
|
||||
{
|
||||
(() => {
|
||||
const status = problem.acs_per_strategy.parse_only;
|
||||
const running = problem.running_tasks.some((t: Task) => t.type === 'Parse');
|
||||
|
||||
const color = acsWithGraphOptToColor(status, running);
|
||||
const text = acsWithGraphOptToText(status, running);
|
||||
|
||||
return <TableCell align="center"><Chip color={color} label={`${text} (${problem.parsing_used} Parsing)`} sx={{ cursor: 'inherit' }} /></TableCell>;
|
||||
})()
|
||||
}
|
||||
{
|
||||
STRATEGIES_WITHOUT_PARSE.map((strategy) => {
|
||||
const status = problem.acs_per_strategy[strategy.replace(/^([A-Z])/, (_, p1) => p1.toLowerCase()).replace(/([A-Z])/g, (_, p1) => `_${p1.toLowerCase()}`) as StrategySnakeCase];
|
||||
const running = problem.running_tasks.some((t: Task) => t.type === 'Solve' && t.content === strategy);
|
||||
|
||||
const color = acsWithGraphOptToColor(status, running);
|
||||
const text = acsWithGraphOptToText(status, running);
|
||||
|
||||
return <TableCell key={strategy} align="center"><Chip color={color} label={text} sx={{ cursor: 'inherit' }} /></TableCell>;
|
||||
})
|
||||
}
|
||||
</TableRow>
|
||||
))}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</TableContainer>
|
||||
</Paper>
|
||||
</Container>
|
||||
)}
|
||||
<AdfNewForm fetchProblems={fetchProblems} />
|
||||
</>
|
||||
);
|
||||
}
|
||||
|
||||
export default AdfOverview;
|
||||
155
frontend/src/components/app.tsx
Normal file
@ -0,0 +1,155 @@
|
||||
import React, { useState, useMemo } from 'react';
|
||||
|
||||
import { createBrowserRouter, RouterProvider } from 'react-router-dom';
|
||||
|
||||
import { ThemeProvider, createTheme } from '@mui/material/styles';
|
||||
import {
|
||||
Alert,
|
||||
AlertColor,
|
||||
Backdrop,
|
||||
Container,
|
||||
CircularProgress,
|
||||
CssBaseline,
|
||||
Link,
|
||||
Snackbar,
|
||||
Stack,
|
||||
useMediaQuery,
|
||||
} from '@mui/material';
|
||||
|
||||
import LoadingContext from './loading-context';
|
||||
import SnackbarContext from './snackbar-context';
|
||||
import Footer from './footer';
|
||||
import AdfOverview from './adf-overview';
|
||||
import AdfDetails from './adf-details';
|
||||
|
||||
const browserRouter = createBrowserRouter([
|
||||
{
|
||||
path: '/',
|
||||
element: <AdfOverview />,
|
||||
},
|
||||
{
|
||||
path: '/:adfName',
|
||||
element: <AdfDetails />,
|
||||
},
|
||||
]);
|
||||
|
||||
function App() {
|
||||
const prefersDarkMode = useMediaQuery('(prefers-color-scheme: dark)');
|
||||
|
||||
const theme = useMemo(
|
||||
() => createTheme({
|
||||
palette: {
|
||||
mode: prefersDarkMode ? 'dark' : 'light',
|
||||
},
|
||||
}),
|
||||
[prefersDarkMode],
|
||||
);
|
||||
|
||||
const [loading, setLoading] = useState(false);
|
||||
const loadingContext = useMemo(() => ({ loading, setLoading }), [loading, setLoading]);
|
||||
|
||||
const [snackbarInfo, setSnackbarInfo] = useState<{
|
||||
message: string,
|
||||
severity: AlertColor,
|
||||
potentialUserChange: boolean,
|
||||
} | undefined>();
|
||||
const snackbarContext = useMemo(
|
||||
() => ({ status: snackbarInfo, setStatus: setSnackbarInfo }),
|
||||
[snackbarInfo, setSnackbarInfo],
|
||||
);
|
||||
|
||||
return (
|
||||
<ThemeProvider theme={theme}>
|
||||
<LoadingContext.Provider value={loadingContext}>
|
||||
<SnackbarContext.Provider value={snackbarContext}>
|
||||
<CssBaseline />
|
||||
<main style={{ maxHeight: 'calc(100vh - 70px)', overflowY: 'auto' }}>
|
||||
<RouterProvider router={browserRouter} />
|
||||
|
||||
<Container sx={{ marginTop: 4 }}>
|
||||
<Stack direction="row" justifyContent="center" flexWrap="wrap">
|
||||
<Link href="https://www.innosale.eu/" target="_blank" rel="noopener noreferrer">
|
||||
<img
|
||||
src={new URL('../innosale-logo.png', import.meta.url).toString()}
|
||||
alt="InnoSale Logo"
|
||||
height="40"
|
||||
style={{
|
||||
display: 'inline-block', borderRadius: 4, margin: 2, boxShadow: '0 0 5px 0 rgba(0,0,0,0.4)', padding: 8, background: '#FFFFFF',
|
||||
}}
|
||||
/>
|
||||
</Link>
|
||||
<Link href="https://scads.ai/" target="_blank" rel="noopener noreferrer">
|
||||
<img
|
||||
src={new URL('../scads-logo.png', import.meta.url).toString()}
|
||||
alt="Scads.AI Logo"
|
||||
height="40"
|
||||
style={{
|
||||
display: 'inline-block', borderRadius: 4, margin: 2, boxShadow: '0 0 5px 0 rgba(0,0,0,0.4)', padding: 2, background: '#FFFFFF',
|
||||
}}
|
||||
/>
|
||||
</Link>
|
||||
<Link href="https://secai.org/" target="_blank" rel="noopener noreferrer">
|
||||
<img
|
||||
src={new URL('../secai-logo.png', import.meta.url).toString()}
|
||||
alt="Secai Logo"
|
||||
height="40"
|
||||
style={{
|
||||
display: 'inline-block', borderRadius: 4, margin: 2, boxShadow: '0 0 5px 0 rgba(0,0,0,0.4)',
|
||||
}}
|
||||
/>
|
||||
</Link>
|
||||
<Link href="https://perspicuous-computing.science" target="_blank" rel="noopener noreferrer">
|
||||
<img
|
||||
src={new URL('../cpec-logo.png', import.meta.url).toString()}
|
||||
alt="CPEC Logo"
|
||||
height="40"
|
||||
style={{
|
||||
display: 'inline-block', borderRadius: 4, margin: 2, boxShadow: '0 0 5px 0 rgba(0,0,0,0.4)', padding: 8, background: '#FFFFFF',
|
||||
}}
|
||||
/>
|
||||
</Link>
|
||||
<Link href="https://iccl.inf.tu-dresden.de" target="_blank" rel="noopener noreferrer">
|
||||
<img
|
||||
src={new URL('../iccl-logo.png', import.meta.url).toString()}
|
||||
alt="ICCL Logo"
|
||||
height="40"
|
||||
style={{
|
||||
display: 'inline-block', borderRadius: 4, margin: 2, boxShadow: '0 0 5px 0 rgba(0,0,0,0.4)', padding: 4, background: '#FFFFFF',
|
||||
}}
|
||||
/>
|
||||
</Link>
|
||||
<Link href="https://tu-dresden.de" target="_blank" rel="noopener noreferrer">
|
||||
<img
|
||||
src={new URL('../tud-logo.png', import.meta.url).toString()}
|
||||
alt="TU Dresden Logo"
|
||||
height="40"
|
||||
style={{
|
||||
display: 'inline-block', borderRadius: 4, margin: 2, boxShadow: '0 0 5px 0 rgba(0,0,0,0.4)',
|
||||
}}
|
||||
/>
|
||||
</Link>
|
||||
</Stack>
|
||||
</Container>
|
||||
</main>
|
||||
|
||||
<Footer />
|
||||
|
||||
<Backdrop
|
||||
open={loading}
|
||||
>
|
||||
<CircularProgress color="inherit" />
|
||||
</Backdrop>
|
||||
<Snackbar
|
||||
open={!!snackbarInfo}
|
||||
autoHideDuration={10000}
|
||||
onClose={() => setSnackbarInfo(undefined)}
|
||||
>
|
||||
<Alert severity={snackbarInfo?.severity}>{snackbarInfo?.message}</Alert>
|
||||
</Snackbar>
|
||||
</SnackbarContext.Provider>
|
||||
</LoadingContext.Provider>
|
||||
</ThemeProvider>
|
||||
);
|
||||
}
|
||||
|
||||
export default App;
|
||||
247
frontend/src/components/footer.tsx
Normal file
@ -0,0 +1,247 @@
|
||||
import React, {
|
||||
useState, useCallback, useContext, useEffect, useRef,
|
||||
} from 'react';
|
||||
|
||||
import {
|
||||
AlertColor,
|
||||
Alert,
|
||||
AppBar,
|
||||
Box,
|
||||
Button,
|
||||
Dialog,
|
||||
DialogActions,
|
||||
DialogContent,
|
||||
DialogTitle,
|
||||
Link,
|
||||
TextField,
|
||||
Toolbar,
|
||||
} from '@mui/material';
|
||||
|
||||
import LoadingContext from './loading-context';
|
||||
import SnackbarContext from './snackbar-context';
|
||||
|
||||
enum UserFormType {
|
||||
Login = 'Login',
|
||||
Register = 'Register',
|
||||
Update = 'Update',
|
||||
}
|
||||
|
||||
interface UserFormProps {
|
||||
formType: UserFormType | null;
|
||||
close: (message?: string, severity?: AlertColor) => void;
|
||||
username?: string;
|
||||
}
|
||||
|
||||
function UserForm({ username: propUsername, formType, close }: UserFormProps) {
|
||||
const { setLoading } = useContext(LoadingContext);
|
||||
const [username, setUsername] = useState<string>(propUsername || '');
|
||||
const [password, setPassword] = useState<string>('');
|
||||
const [errorOccurred, setError] = useState<boolean>(false);
|
||||
|
||||
const submitHandler = useCallback(
|
||||
(del: boolean) => {
|
||||
setLoading(true);
|
||||
setError(false);
|
||||
|
||||
let method; let
|
||||
endpoint;
|
||||
if (del) {
|
||||
method = 'DELETE';
|
||||
endpoint = '/users/delete';
|
||||
} else {
|
||||
switch (formType) {
|
||||
case UserFormType.Login:
|
||||
method = 'POST';
|
||||
endpoint = '/users/login';
|
||||
break;
|
||||
case UserFormType.Register:
|
||||
method = 'POST';
|
||||
endpoint = '/users/register';
|
||||
break;
|
||||
case UserFormType.Update:
|
||||
method = 'PUT';
|
||||
endpoint = '/users/update';
|
||||
break;
|
||||
default:
|
||||
// NOTE: the value is not null when the dialog is open
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
fetch(`${process.env.NODE_ENV === 'development' ? '//localhost:8080' : ''}${endpoint}`, {
|
||||
method,
|
||||
credentials: process.env.NODE_ENV === 'development' ? 'include' : 'same-origin',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: !del ? JSON.stringify({ username, password }) : undefined,
|
||||
})
|
||||
.then((res) => {
|
||||
switch (res.status) {
|
||||
case 200:
|
||||
close(`Action '${del ? 'Delete' : formType}' successful!`, 'success');
|
||||
break;
|
||||
default:
|
||||
setError(true);
|
||||
break;
|
||||
}
|
||||
})
|
||||
.finally(() => setLoading(false));
|
||||
},
|
||||
[username, password, formType],
|
||||
);
|
||||
|
||||
return (
|
||||
<form onSubmit={(e) => { e.preventDefault(); submitHandler(false); }}>
|
||||
<DialogTitle>{formType}</DialogTitle>
|
||||
<DialogContent>
|
||||
<TextField
|
||||
variant="standard"
|
||||
type="text"
|
||||
label="Username"
|
||||
value={username}
|
||||
onChange={(event) => { setUsername(event.target.value); }}
|
||||
/>
|
||||
<br />
|
||||
<TextField
|
||||
variant="standard"
|
||||
type="password"
|
||||
label="Password"
|
||||
value={password}
|
||||
onChange={(event) => { setPassword(event.target.value); }}
|
||||
/>
|
||||
{errorOccurred
|
||||
&& <Alert severity="error">Check your inputs!</Alert>}
|
||||
</DialogContent>
|
||||
<DialogActions>
|
||||
<Button type="button" onClick={() => close()}>Cancel</Button>
|
||||
<Button type="submit" variant="contained" color="success">{formType}</Button>
|
||||
{formType === UserFormType.Update
|
||||
// TODO: add another confirm dialog here
|
||||
&& (
|
||||
<Button
|
||||
type="button"
|
||||
variant="outlined"
|
||||
color="error"
|
||||
onClick={() => {
|
||||
// eslint-disable-next-line no-alert
|
||||
if (window.confirm('Are you sure that you want to delete your account?')) {
|
||||
submitHandler(true);
|
||||
}
|
||||
}}
|
||||
>
|
||||
Delete Account
|
||||
</Button>
|
||||
)}
|
||||
</DialogActions>
|
||||
</form>
|
||||
);
|
||||
}
|
||||
|
||||
UserForm.defaultProps = { username: undefined };
|
||||
|
||||
function Footer() {
|
||||
const { status: snackbarInfo, setStatus: setSnackbarInfo } = useContext(SnackbarContext);
|
||||
const [username, setUsername] = useState<string>();
|
||||
const [tempUser, setTempUser] = useState<boolean>();
|
||||
const [dialogTypeOpen, setDialogTypeOpen] = useState<UserFormType | null>(null);
|
||||
|
||||
const isFirstRender = useRef(true);
|
||||
|
||||
const logout = useCallback(() => {
|
||||
fetch(`${process.env.NODE_ENV === 'development' ? '//localhost:8080' : ''}/users/logout`, {
|
||||
method: 'DELETE',
|
||||
credentials: process.env.NODE_ENV === 'development' ? 'include' : 'same-origin',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
})
|
||||
.then((res) => {
|
||||
switch (res.status) {
|
||||
case 200:
|
||||
setSnackbarInfo({ message: 'Logout successful!', severity: 'success', potentialUserChange: true });
|
||||
setUsername(undefined);
|
||||
break;
|
||||
default:
|
||||
setSnackbarInfo({ message: 'An error occurred while trying to log out.', severity: 'error', potentialUserChange: false });
|
||||
break;
|
||||
}
|
||||
});
|
||||
}, [setSnackbarInfo]);
|
||||
|
||||
useEffect(() => {
|
||||
// TODO: having the info if the user may have changed on the snackbar info
|
||||
// is a bit lazy and unclean; be better!
|
||||
if (isFirstRender.current || snackbarInfo?.potentialUserChange) {
|
||||
isFirstRender.current = false;
|
||||
|
||||
fetch(`${process.env.NODE_ENV === 'development' ? '//localhost:8080' : ''}/users/info`, {
|
||||
method: 'GET',
|
||||
credentials: process.env.NODE_ENV === 'development' ? 'include' : 'same-origin',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
})
|
||||
.then((res) => {
|
||||
switch (res.status) {
|
||||
case 200:
|
||||
res.json().then(({ username: user, temp }) => {
|
||||
setUsername(user);
|
||||
setTempUser(temp);
|
||||
});
|
||||
break;
|
||||
default:
|
||||
setUsername(undefined);
|
||||
break;
|
||||
}
|
||||
});
|
||||
}
|
||||
}, [snackbarInfo?.potentialUserChange]);
|
||||
|
||||
return (
|
||||
<>
|
||||
<AppBar position="fixed" sx={{ top: 'auto', bottom: 0 }}>
|
||||
<Toolbar sx={{ justifyContent: 'center', alignItems: 'center' }}>
|
||||
<Box sx={{ flexGrow: 1 }}>
|
||||
{username ? (
|
||||
<>
|
||||
<span>
|
||||
Logged in as:
|
||||
{' '}
|
||||
{username}
|
||||
{' '}
|
||||
{tempUser ? '(Temporary User. Edit to set a password!)' : undefined}
|
||||
</span>
|
||||
<Button color="inherit" onClick={() => setDialogTypeOpen(UserFormType.Update)}>Edit</Button>
|
||||
{!tempUser && <Button color="inherit" onClick={() => logout()}>Logout</Button>}
|
||||
</>
|
||||
) : (
|
||||
<>
|
||||
<Button color="inherit" onClick={() => setDialogTypeOpen(UserFormType.Login)}>Login</Button>
|
||||
<Button color="inherit" onClick={() => setDialogTypeOpen(UserFormType.Register)}>Register</Button>
|
||||
</>
|
||||
)}
|
||||
</Box>
|
||||
|
||||
<Link color="inherit" href="/legal.html" target="_blank" sx={{ fontSize: '0.8rem' }}>
|
||||
Legal Information (Impressum and Data Protection Regulation)
|
||||
</Link>
|
||||
</Toolbar>
|
||||
</AppBar>
|
||||
<Dialog open={!!dialogTypeOpen} onClose={() => setDialogTypeOpen(null)}>
|
||||
<UserForm
|
||||
formType={dialogTypeOpen}
|
||||
close={(message, severity) => {
|
||||
setDialogTypeOpen(null);
|
||||
setSnackbarInfo((!!message && !!severity)
|
||||
? { message, severity, potentialUserChange: true }
|
||||
: undefined);
|
||||
}}
|
||||
username={dialogTypeOpen === UserFormType.Update ? username : undefined}
|
||||
/>
|
||||
</Dialog>
|
||||
</>
|
||||
);
|
||||
}
|
||||
|
||||
export default Footer;
|
||||
381
frontend/src/components/graph-g6.tsx
Normal file
@ -0,0 +1,381 @@
|
||||
import React, { useEffect, useRef } from 'react';
|
||||
|
||||
import G6, { Graph } from '@antv/g6';
|
||||
|
||||
G6.registerNode('nodeWithFlag', {
|
||||
draw(cfg, group) {
|
||||
const mainWidth = Math.max(30, 5 * (cfg!.mainLabel as string).length + 10);
|
||||
const mainHeight = 30;
|
||||
|
||||
const keyShape = group!.addShape('rect', {
|
||||
attrs: {
|
||||
width: mainWidth,
|
||||
height: mainHeight,
|
||||
radius: 2,
|
||||
fill: 'white',
|
||||
stroke: 'black',
|
||||
cursor: 'pointer',
|
||||
},
|
||||
name: 'rectMainLabel',
|
||||
draggable: true,
|
||||
});
|
||||
|
||||
group!.addShape('text', {
|
||||
attrs: {
|
||||
x: mainWidth / 2,
|
||||
y: mainHeight / 2,
|
||||
textAlign: 'center',
|
||||
textBaseline: 'middle',
|
||||
text: cfg!.mainLabel,
|
||||
fill: '#212121',
|
||||
fontFamily: 'Roboto',
|
||||
cursor: 'pointer',
|
||||
},
|
||||
// must be assigned in G6 3.3 and later versions. it can be any value you want
|
||||
name: 'textMailLabel',
|
||||
// allow the shape to respond to the drag events
|
||||
draggable: true,
|
||||
});
|
||||
|
||||
if (cfg!.subLabel) {
|
||||
const subWidth = 5 * (cfg!.subLabel as string).length + 4;
|
||||
const subHeight = 20;
|
||||
|
||||
const subRectX = mainWidth - 4;
|
||||
const subRectY = -subHeight + 4;
|
||||
|
||||
group!.addShape('rect', {
|
||||
attrs: {
|
||||
x: subRectX,
|
||||
y: subRectY,
|
||||
width: subWidth,
|
||||
height: subHeight,
|
||||
radius: 1,
|
||||
fill: '#4caf50',
|
||||
stroke: '#1b5e20',
|
||||
cursor: 'pointer',
|
||||
},
|
||||
name: 'rectMainLabel',
|
||||
draggable: true,
|
||||
});
|
||||
|
||||
group!.addShape('text', {
|
||||
attrs: {
|
||||
x: subRectX + subWidth / 2,
|
||||
y: subRectY + subHeight / 2,
|
||||
textAlign: 'center',
|
||||
textBaseline: 'middle',
|
||||
text: cfg!.subLabel,
|
||||
fill: '#212121',
|
||||
fontFamily: 'Roboto',
|
||||
fontSize: 10,
|
||||
cursor: 'pointer',
|
||||
},
|
||||
// must be assigned in G6 3.3 and later versions. it can be any value you want
|
||||
name: 'textMailLabel',
|
||||
// allow the shape to respond to the drag events
|
||||
draggable: true,
|
||||
});
|
||||
}
|
||||
|
||||
return keyShape;
|
||||
},
|
||||
getAnchorPoints() {
|
||||
return [[0.5, 0], [0, 0.5], [1, 0.5], [0.5, 1]];
|
||||
},
|
||||
// nodeStateStyles: {
|
||||
// hover: {
|
||||
// fill: 'lightsteelblue',
|
||||
// },
|
||||
// highlight: {
|
||||
// lineWidth: 3,
|
||||
// },
|
||||
// lowlight: {
|
||||
// opacity: 0.3,
|
||||
// },
|
||||
// },
|
||||
setState(name, value, item) {
|
||||
if (!item) { return; }
|
||||
const group = item.getContainer();
|
||||
const mainShape = group.get('children')[0]; // Find the first graphics shape of the node. It is determined by the order of being added
|
||||
const subShape = group.get('children')[2];
|
||||
|
||||
if (name === 'hover') {
|
||||
if (value) {
|
||||
mainShape.attr('fill', 'lightsteelblue');
|
||||
} else {
|
||||
mainShape.attr('fill', 'white');
|
||||
}
|
||||
}
|
||||
|
||||
if (name === 'highlight') {
|
||||
if (value) {
|
||||
mainShape.attr('lineWidth', 3);
|
||||
} else {
|
||||
mainShape.attr('lineWidth', 1);
|
||||
}
|
||||
}
|
||||
|
||||
if (name === 'lowlight') {
|
||||
if (value) {
|
||||
mainShape.attr('opacity', 0.3);
|
||||
if (subShape) {
|
||||
subShape.attr('opacity', 0.3);
|
||||
}
|
||||
} else {
|
||||
mainShape.attr('opacity', 1);
|
||||
if (subShape) {
|
||||
subShape.attr('opacity', 1);
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
});
|
||||
|
||||
export interface GraphProps {
|
||||
lo_edges: [string, string][],
|
||||
hi_edges: [string, string][],
|
||||
node_labels: { [key: string]: string },
|
||||
tree_root_labels: { [key: string]: string[] },
|
||||
}
|
||||
|
||||
function nodesAndEdgesFromGraphProps(graphProps: GraphProps) {
|
||||
const nodes = Object.keys(graphProps.node_labels).map((id) => {
|
||||
const mainLabel = graphProps.node_labels[id];
|
||||
const subLabel = graphProps.tree_root_labels[id].length > 0 ? `Root for: ${graphProps.tree_root_labels[id].join(' ; ')}` : undefined;
|
||||
|
||||
// const label = subLabel.length > 0 ? `${mainLabel}\n${subLabel}` : mainLabel;
|
||||
|
||||
return {
|
||||
id: id.toString(),
|
||||
mainLabel,
|
||||
subLabel,
|
||||
// style: {
|
||||
// height: subLabel.length > 0 ? 60 : 30,
|
||||
// width: Math.max(30, 5 * mainLabel.length + 10, 5 * subLabel.length + 10),
|
||||
// },
|
||||
};
|
||||
});
|
||||
const edges = graphProps.lo_edges.map(([source, target]) => ({
|
||||
id: `LO_${source}_${target}`, source: source.toString(), target: target.toString(), style: { stroke: '#ed6c02', lineWidth: 2 },
|
||||
}))
|
||||
.concat(graphProps.hi_edges.map(([source, target]) => ({
|
||||
id: `HI_${source}_${target}`, source: source.toString(), target: target.toString(), style: { stroke: '#1976d2', lineWidth: 2 },
|
||||
})));
|
||||
|
||||
return { nodes, edges };
|
||||
}
|
||||
|
||||
interface Props {
|
||||
graph: GraphProps,
|
||||
}
|
||||
|
||||
function GraphG6(props: Props) {
|
||||
const { graph: graphProps } = props;
|
||||
|
||||
const ref = useRef(null);
|
||||
|
||||
const graphRef = useRef<Graph>();
|
||||
|
||||
useEffect(
|
||||
() => {
|
||||
if (!graphRef.current) {
|
||||
graphRef.current = new Graph({
|
||||
container: ref.current!,
|
||||
height: 800,
|
||||
fitView: true,
|
||||
modes: {
|
||||
default: ['drag-canvas', 'zoom-canvas', 'drag-node'],
|
||||
},
|
||||
layout: {
|
||||
type: 'dagre',
|
||||
rankdir: 'BT',
|
||||
},
|
||||
// defaultNode: {
|
||||
// anchorPoints: [[0.5, 0], [0, 0.5], [1, 0.5], [0.5, 1]],
|
||||
// type: 'rect',
|
||||
// style: {
|
||||
// radius: 2,
|
||||
// },
|
||||
// labelCfg: {
|
||||
// style: {
|
||||
/// / fontWeight: 700,
|
||||
// fontFamily: 'Roboto',
|
||||
// },
|
||||
// },
|
||||
// },
|
||||
defaultNode: { type: 'nodeWithFlag' },
|
||||
defaultEdge: {
|
||||
style: {
|
||||
endArrow: true,
|
||||
},
|
||||
},
|
||||
// nodeStateStyles: {
|
||||
// hover: {
|
||||
// fill: 'lightsteelblue',
|
||||
// },
|
||||
// highlight: {
|
||||
// lineWidth: 3,
|
||||
// },
|
||||
// lowlight: {
|
||||
// opacity: 0.3,
|
||||
// },
|
||||
// },
|
||||
edgeStateStyles: {
|
||||
lowlight: {
|
||||
opacity: 0.3,
|
||||
},
|
||||
},
|
||||
animate: true,
|
||||
animateCfg: {
|
||||
duration: 500,
|
||||
easing: 'easePolyInOut',
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
const graph = graphRef.current;
|
||||
|
||||
// Mouse enter a node
|
||||
graph.on('node:mouseenter', (e) => {
|
||||
const nodeItem = e.item!; // Get the target item
|
||||
graph.setItemState(nodeItem, 'hover', true); // Set the state 'hover' of the item to be true
|
||||
});
|
||||
|
||||
// Mouse leave a node
|
||||
graph.on('node:mouseleave', (e) => {
|
||||
const nodeItem = e.item!; // Get the target item
|
||||
graph.setItemState(nodeItem, 'hover', false); // Set the state 'hover' of the item to be false
|
||||
});
|
||||
},
|
||||
[],
|
||||
);
|
||||
|
||||
useEffect(
|
||||
() => {
|
||||
const graph = graphRef.current!;
|
||||
|
||||
// Click a node
|
||||
graph.on('node:click', (e) => {
|
||||
const nodeItem = e.item!; // Get the clicked item
|
||||
|
||||
let onlyRemoveStates = false;
|
||||
if (nodeItem.hasState('highlight')) {
|
||||
onlyRemoveStates = true;
|
||||
}
|
||||
|
||||
const clickNodes = graph.findAllByState('node', 'highlight');
|
||||
clickNodes.forEach((cn) => {
|
||||
graph.setItemState(cn, 'highlight', false);
|
||||
});
|
||||
|
||||
const lowlightNodes = graph.findAllByState('node', 'lowlight');
|
||||
lowlightNodes.forEach((cn) => {
|
||||
graph.setItemState(cn, 'lowlight', false);
|
||||
});
|
||||
const lowlightEdges = graph.findAllByState('edge', 'lowlight');
|
||||
lowlightEdges.forEach((cn) => {
|
||||
graph.setItemState(cn, 'lowlight', false);
|
||||
});
|
||||
|
||||
if (onlyRemoveStates) {
|
||||
return;
|
||||
}
|
||||
|
||||
graph.getNodes().forEach((node) => {
|
||||
graph.setItemState(node, 'lowlight', true);
|
||||
});
|
||||
graph.getEdges().forEach((edge) => {
|
||||
graph.setItemState(edge, 'lowlight', true);
|
||||
});
|
||||
|
||||
const relevantNodeIds: string[] = [];
|
||||
const relevantLoEdges: [string, string][] = [];
|
||||
const relevantHiEdges: [string, string][] = [];
|
||||
let newNodeIds: string[] = [nodeItem.getModel().id!];
|
||||
let newLoEdges: [string, string][] = [];
|
||||
let newHiEdges: [string, string][] = [];
|
||||
|
||||
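// Starting from the clicked node, repeatedly collect the outgoing lo/hi edges of the
// already-relevant nodes and the nodes they point to, until nothing new is found;
// these reachable nodes and edges are the ones that keep their highlight below.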
while (newNodeIds.length > 0 || newLoEdges.length > 0 || newHiEdges.length > 0) {
|
||||
relevantNodeIds.push(...newNodeIds);
|
||||
relevantLoEdges.push(...newLoEdges);
|
||||
relevantHiEdges.push(...newHiEdges);
|
||||
|
||||
newLoEdges = graphProps.lo_edges
|
||||
.filter((edge) => relevantNodeIds.includes(edge[0].toString())
|
||||
&& !relevantLoEdges.includes(edge));
|
||||
newHiEdges = graphProps.hi_edges
|
||||
.filter((edge) => relevantNodeIds.includes(edge[0].toString())
|
||||
&& !relevantHiEdges.includes(edge));
|
||||
|
||||
newNodeIds = newLoEdges
|
||||
.concat(newHiEdges)
|
||||
.map((edge) => edge[1].toString())
|
||||
.filter((id) => !relevantNodeIds.includes(id));
|
||||
}
|
||||
|
||||
const relevantEdgeIds = relevantLoEdges
|
||||
.map(([source, target]) => `LO_${source}_${target}`)
|
||||
.concat(
|
||||
relevantHiEdges
|
||||
.map(([source, target]) => `HI_${source}_${target}`),
|
||||
);
|
||||
|
||||
relevantNodeIds
|
||||
.forEach((id) => {
|
||||
graph.setItemState(id, 'lowlight', false);
|
||||
graph.setItemState(id, 'highlight', true);
|
||||
});
|
||||
|
||||
relevantEdgeIds
|
||||
.forEach((id) => {
|
||||
graph.setItemState(id, 'lowlight', false);
|
||||
});
|
||||
|
||||
// graph.setItemState(nodeItem, 'lowlight', false);
|
||||
// graph.setItemState(nodeItem, 'highlight', true);
|
||||
// nodeItem.getEdges().forEach((edge) => {
|
||||
// graph.setItemState(edge, 'lowlight', false);
|
||||
// });
|
||||
});
|
||||
|
||||
return () => { graph.off('node:click'); };
|
||||
},
|
||||
[graphProps],
|
||||
);
|
||||
|
||||
useEffect(
|
||||
() => {
|
||||
const graph = graphRef.current!;
|
||||
|
||||
const { nodes, edges } = nodesAndEdgesFromGraphProps(graphProps);
|
||||
|
||||
graph.changeData({
|
||||
nodes,
|
||||
edges,
|
||||
});
|
||||
},
|
||||
[graphProps],
|
||||
);
|
||||
|
||||
return (
|
||||
<>
|
||||
<div ref={ref} style={{ overflow: 'hidden' }} />
|
||||
<div style={{ padding: 4 }}>
|
||||
<span style={{ color: '#ed6c02', marginRight: 8 }}>lo edge (condition is false)</span>
|
||||
{' '}
|
||||
<span style={{ color: '#1976d2', marginRight: 8 }}>hi edge (condition is true)</span>
|
||||
{' '}
|
||||
Click nodes to highlight paths! (You can also drag and zoom.)
|
||||
<br />
|
||||
The
|
||||
{' '}
|
||||
<span style={{ color: '#4caf50' }}>Root for: X</span>
|
||||
{' '}
|
||||
labels indicate where to start looking to determine the truth value of statement X.
|
||||
</div>
|
||||
</>
|
||||
);
|
||||
}
|
||||
|
||||
export default GraphG6;
|
||||
13
frontend/src/components/loading-context.ts
Normal file
@ -0,0 +1,13 @@
|
||||
import { createContext } from 'react';
|
||||
|
||||
interface ILoadingContext {
|
||||
loading: boolean;
|
||||
setLoading: (loading: boolean) => void;
|
||||
}
|
||||
|
||||
const LoadingContext = createContext<ILoadingContext>({
|
||||
loading: false,
|
||||
setLoading: () => {},
|
||||
});
|
||||
|
||||
export default LoadingContext;
|
||||
58
frontend/src/components/markdown.tsx
Normal file
@ -0,0 +1,58 @@
|
||||
import React from 'react';
|
||||
import ReactMarkdown from 'markdown-to-jsx';
|
||||
import {
|
||||
Box,
|
||||
Link,
|
||||
Typography,
|
||||
} from '@mui/material';
|
||||
|
||||
const options = {
|
||||
overrides: {
|
||||
h1: {
|
||||
component: Typography,
|
||||
props: {
|
||||
gutterBottom: true,
|
||||
variant: 'h4',
|
||||
},
|
||||
},
|
||||
h2: {
|
||||
component: Typography,
|
||||
props: { gutterBottom: true, variant: 'h6' },
|
||||
},
|
||||
h3: {
|
||||
component: Typography,
|
||||
props: { gutterBottom: true, variant: 'subtitle1' },
|
||||
},
|
||||
h4: {
|
||||
component: Typography,
|
||||
props: {
|
||||
gutterBottom: true,
|
||||
variant: 'caption',
|
||||
paragraph: true,
|
||||
},
|
||||
},
|
||||
p: {
|
||||
component: Typography,
|
||||
props: { paragraph: true, sx: { '&:last-child': { marginBottom: 0 } } },
|
||||
},
|
||||
a: {
|
||||
component: (props: any) => (
|
||||
// eslint-disable-next-line react/jsx-props-no-spreading
|
||||
<Link target="_blank" rel="noopener noreferrer" {...props} />
|
||||
),
|
||||
},
|
||||
li: {
|
||||
component: (props: any) => (
|
||||
<Box component="li" sx={{ mt: 1 }}>
|
||||
{/* eslint-disable-next-line react/jsx-props-no-spreading */}
|
||||
<Typography component="span" {...props} />
|
||||
</Box>
|
||||
),
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
export default function Markdown(props: any) {
|
||||
// eslint-disable-next-line react/jsx-props-no-spreading
|
||||
return <ReactMarkdown options={options} {...props} />;
|
||||
}
|
||||
17
frontend/src/components/snackbar-context.ts
Normal file
@ -0,0 +1,17 @@
|
||||
import { createContext } from 'react';
|
||||
|
||||
import { AlertColor } from '@mui/material';
|
||||
|
||||
type Status = { message: string, severity: AlertColor, potentialUserChange: boolean } | undefined;
|
||||
|
||||
interface ISnackbarContext {
|
||||
status: Status;
|
||||
setStatus: (status: Status) => void;
|
||||
}
|
||||
|
||||
const SnackbarContext = createContext<ISnackbarContext>({
|
||||
status: undefined,
|
||||
setStatus: () => {},
|
||||
});
|
||||
|
||||
export default SnackbarContext;
|
||||
BIN
frontend/src/cpec-logo.png
Normal file
|
After Width: | Height: | Size: 47 KiB |
23
frontend/src/help-texts/add-info.md
Normal file
@ -0,0 +1,23 @@
|
||||
ADF-BDD.dev allows you to solve Abstract Dialectical Frameworks (ADFs). The ADFs are represented as Binary Decision Diagrams (BDDs).
The Web UI mimics many options of the CLI version of the [underlying adf-bdd tool](https://github.com/ellmau/adf-obdd). The syntax for the ADF code is identical.

In the form below, you can either type/paste your `code` or upload a file in the same format.
To put it briefly, an ADF consists of statements and acceptance conditions for these statements.
For instance, the following code indicates that `a,b,c,d` are statements, that `a` is assumed to be true (verum), `b` is true if `b` is true (which is tautological), `c` is true if `a` and `b` are true, and `d` is true if `b` is false.

```
s(a).
s(b).
s(c).
s(d).
ac(a,c(v)).
ac(b,b).
ac(c,and(a,b)).
ac(d,neg(b)).
```

Internally, the ADF is represented as a BDD.
The `Parsing Strategy` determines the internal implementation used for this representation. `Naive` uses our tool's own BDD implementation. `Hybrid` combines our approach with the existing Rust BDD library [`biodivine`](https://crates.io/crates/biodivine-lib-bdd). Don't be concerned about this choice if you are new to this tool; just pick either one.
You will get a view of the BDD in the detail view after you have added the problem.

You can optionally set a name for your ADF problem; otherwise, a random name will be chosen. At the moment, the name cannot be changed later (but you can remove and re-add the problem).
13
frontend/src/help-texts/detail-info.md
Normal file
@ -0,0 +1,13 @@
|
||||
First of all, you can review the code that you added. You can also delete the problem if you made a mistake or do not need it anymore.

Further below, you can have a look at the BDD representations of your problem using different semantics.
In principle, each statement gets its own BDD that indicates how its truth value can be obtained from the other ones. Note that every BDD has the `BOT` and `TOP` nodes, ultimately indicating the truth value (false or true, respectively).
All these individual BDDs are displayed in a merged representation where the `Root for:` labels tell you where to start looking
if you want to get the truth value of an individual statement.
For instance, consider a BDD that (besides `BOT` and `TOP`) only contains a node `b` annotated with `Root for: a` and the annotation `Root for: b` at the `TOP` node.
Since the root for `b` is the `TOP` node, we know that `b` must be true. Then, to obtain the truth value for `a`, we start at the `b` node and, since we know that `b` must be true, we can follow the blue edge to obtain the value for `a` (we will end up in `BOT` or `TOP` there). If `b` were false, we would follow the orange edge analogously. Note that it is not always possible to directly determine the truth values of statements (which is exactly why we need tools like this).

On the very left, you can view the initial representation of your problem after parsing. This also indicates the parsing strategy that you have chosen (`Naive` or `Hybrid`).
The other tabs allow you to solve the problem using different semantics and optimizations. Some of them (e.g. `complete`) may produce multiple models that you can cycle through.
To get a better idea of the differences, you can have a look at the [command line tool](https://github.com/ellmau/adf-obdd/tree/main/bin).

BIN
frontend/src/help-texts/example-bdd.png
Normal file
|
After Width: | Height: | Size: 51 KiB |
BIN
frontend/src/iccl-logo.png
Normal file
|
After Width: | Height: | Size: 22 KiB |
18
frontend/src/index.html
Normal file
@ -0,0 +1,18 @@
|
||||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8"/>
|
||||
<meta name="viewport" content="initial-scale=1, width=device-width" />
|
||||
<title>ADF-OBDD Web Visualizer</title>
|
||||
<script type="module" src="app.tsx"></script>
|
||||
</head>
|
||||
<body>
|
||||
<noscript>
|
||||
<h1>ADF-BDD.DEV</h1>
|
||||
<p>Turn on Javascript in your browser to use our ADF tool!</p>
|
||||
<a href="./legal.html" target="_blank">Legal Information (Impressum and Data Protection Regulation)</a>
|
||||
</noscript>
|
||||
<div id="app"></div>
|
||||
</body>
|
||||
</html>
|
||||
|
||||
BIN
frontend/src/innosale-logo.png
Normal file
|
After Width: | Height: | Size: 8.7 KiB |
99
frontend/src/legal.html
Normal file
@ -0,0 +1,99 @@
|
||||
<!DOCTYPE html>
|
||||
<html>
|
||||
<head>
|
||||
<title>ADF-BDD.dev - Legal Notice</title>
|
||||
<meta name="description" content="Impressum and Data Protection Regulation for adf-bdd.dev">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1">
|
||||
<style>
|
||||
body {
|
||||
font-family: Helvetica;
|
||||
}
|
||||
h1 {
|
||||
text-align: center;
|
||||
}
|
||||
section {
|
||||
max-width: 1000px;
|
||||
margin: 0 auto 32px;
|
||||
padding: 16px;
|
||||
box-shadow: 0 0 10px 0px rgba(0,0,0,0.4);
|
||||
}
|
||||
section > :first-child {
|
||||
margin-top: 0;
|
||||
}
|
||||
section > :last-child {
|
||||
margin-bottom: 0;
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
|
||||
<header>
|
||||
<h1>ADF-BDD.DEV Legal Notice</h1>
|
||||
</header>
|
||||
|
||||
<section>
|
||||
<h2>Impressum</h2>
|
||||
|
||||
The <a href="https://tu-dresden.de/impressum?set_language=en" target="_blank" rel="noreferrer noopener">Impressum of TU Dresden</a> applies with the following amendments:
|
||||
|
||||
<h3>Responsibilities - Content and Technical Implementation</h3>
|
||||
|
||||
<p>
|
||||
Dipl.-Inf. Lukas Gerlach<br>
|
||||
Technische Universität Dresden<br>
|
||||
Fakultät Informatik<br>
|
||||
Institut für Theoretische Informatik<br>
|
||||
Professur für Wissensbasierte Systeme<br>
|
||||
01062 Dresden<br>
|
||||
GERMANY
|
||||
</p>
|
||||
<p>
|
||||
Email: lukas.gerlach@tu-dresden.de<br>
|
||||
Phone: (+49) 351 / 463 43503
|
||||
</p>
|
||||
</section>
|
||||
|
||||
<section>
|
||||
<h2>Data Protection Regulation</h2>
|
||||
<p>
|
||||
We process your personal data only in the form of metadata that is sent to us when you access the website.
|
||||
This is done to pursue our legitimate interest of providing and improving this publicly available website (https://adf-bdd.dev).
|
||||
To this aim, this metadata is also written to server log files.
|
||||
The data may contain the following personal information: public IP address, time of access, internet browser (e.g. user agent and version), operating system, referrer URL, and hostname of the requesting machine.
|
||||
We only set cookies that are necessary for the provision of our service, i.e. to check if a user is logged in.
|
||||
We do not set any so-called tracking cookies and we do not use any third party analytics tools on this website.
|
||||
</p>
|
||||
<h3>Legal basis</h3>
|
||||
<p>
|
||||
The legal basis for the data processing is <a href="https://gdpr.eu/article-6-how-to-process-personal-data-legally/" target="_blank" rel="noreferrer noopener">Art. 6 para. 1 lit. f GDPR</a>.
|
||||
</p>
|
||||
<h3>Rights of data subjects</h3>
|
||||
<ul>
|
||||
<li>You have the right to obtain information from TU Dresden about the data stored about your person and/or to have incorrectly stored data corrected.</li>
|
||||
<li>You have the right to erasure or restriction of the processing and/or a right to object to the processing.</li>
|
||||
<li>You can contact TU Dresden's Data Protection Officer at any time.
|
||||
<p>
|
||||
Tel.: +49 351 / 463 32839<br>
|
||||
Fax: +49 351 / 463 39718<br>
|
||||
Email: informationssicherheit@tu-dresden.de<br>
|
||||
<a href="https://tu-dresden.de/informationssicherheit" target="_blank" rel="noreferrer noopener">https://tu-dresden.de/informationssicherheit</a>
|
||||
</p>
|
||||
</li>
|
||||
<li>
|
||||
You also have the right to complain to a supervisory authority if you are concerned that the processing of your personal data is an infringement of the law. The competent supervisory authority for data protection is:
|
||||
<p>
|
||||
Saxon Data Protection Commissioner<br>
|
||||
Ms. Dr. Juliane Hundert<br>
|
||||
Devrientstraße 5<br>
|
||||
01067 Dresden<br>
|
||||
Email: saechsdsb@slt.sachsen.de<br>
|
||||
Phone: + 49 351 / 85471 101<br>
|
||||
<a href="http://www.datenschutz.sachsen.de" target="_blank" rel="noreferrer noopener">www.datenschutz.sachsen.de</a>
|
||||
</p>
|
||||
</li>
|
||||
</ul>
|
||||
</section>
|
||||
|
||||
</body>
|
||||
</html>
|
||||
|
||||
BIN
frontend/src/scads-logo.png
Normal file
|
After Width: | Height: | Size: 20 KiB |
BIN
frontend/src/secai-logo.png
Normal file
|
After Width: | Height: | Size: 45 KiB |
19
frontend/src/test-data.ts
Normal file
@ -0,0 +1,19 @@
|
||||
const testData = [
|
||||
{ label: 'BOT', lo: 0, hi: 0 },
|
||||
{ label: 'TOP', lo: 1, hi: 1 },
|
||||
{ label: 'Var8', lo: 0, hi: 1 },
|
||||
{ label: 'Var7', lo: 1, hi: 0 },
|
||||
{ label: 'Var0', lo: 3, hi: 1 },
|
||||
{ label: 'Var9', lo: 0, hi: 1 },
|
||||
{ label: 'Var8', lo: 5, hi: 0 },
|
||||
{ label: 'Var0', lo: 6, hi: 5 },
|
||||
{ label: 'Var1', lo: 0, hi: 1 },
|
||||
{ label: 'Var0', lo: 1, hi: 0 },
|
||||
{ label: 'Var9', lo: 1, hi: 0 },
|
||||
{ label: 'Var8', lo: 0, hi: 10 },
|
||||
{ label: 'Var0', lo: 5, hi: 0 },
|
||||
{ label: 'Var8', lo: 1, hi: 0 },
|
||||
{ label: 'Var5', lo: 13, hi: 0 },
|
||||
];
|
||||
|
||||
export default testData;
|
||||
BIN
frontend/src/tud-logo.png
Normal file
|
After Width: | Height: | Size: 20 KiB |
103
frontend/tsconfig.json
Normal file
@ -0,0 +1,103 @@
|
||||
{
|
||||
"compilerOptions": {
|
||||
/* Visit https://aka.ms/tsconfig to read more about this file */
|
||||
|
||||
/* Projects */
|
||||
// "incremental": true, /* Save .tsbuildinfo files to allow for incremental compilation of projects. */
|
||||
// "composite": true, /* Enable constraints that allow a TypeScript project to be used with project references. */
|
||||
// "tsBuildInfoFile": "./.tsbuildinfo", /* Specify the path to .tsbuildinfo incremental compilation file. */
|
||||
// "disableSourceOfProjectReferenceRedirect": true, /* Disable preferring source files instead of declaration files when referencing composite projects. */
|
||||
// "disableSolutionSearching": true, /* Opt a project out of multi-project reference checking when editing. */
|
||||
// "disableReferencedProjectLoad": true, /* Reduce the number of projects loaded automatically by TypeScript. */
|
||||
|
||||
/* Language and Environment */
|
||||
"target": "es2016", /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */
|
||||
// "lib": [], /* Specify a set of bundled library declaration files that describe the target runtime environment. */
|
||||
"jsx": "preserve", /* Specify what JSX code is generated. */
|
||||
// "experimentalDecorators": true, /* Enable experimental support for TC39 stage 2 draft decorators. */
|
||||
// "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. */
|
||||
// "jsxFactory": "", /* Specify the JSX factory function used when targeting React JSX emit, e.g. 'React.createElement' or 'h'. */
|
||||
// "jsxFragmentFactory": "", /* Specify the JSX Fragment reference used for fragments when targeting React JSX emit e.g. 'React.Fragment' or 'Fragment'. */
|
||||
// "jsxImportSource": "", /* Specify module specifier used to import the JSX factory functions when using 'jsx: react-jsx*'. */
|
||||
// "reactNamespace": "", /* Specify the object invoked for 'createElement'. This only applies when targeting 'react' JSX emit. */
|
||||
// "noLib": true, /* Disable including any library files, including the default lib.d.ts. */
|
||||
// "useDefineForClassFields": true, /* Emit ECMAScript-standard-compliant class fields. */
|
||||
// "moduleDetection": "auto", /* Control what method is used to detect module-format JS files. */
|
||||
|
||||
/* Modules */
|
||||
"module": "esnext", /* Specify what module code is generated. */
|
||||
// "rootDir": "./", /* Specify the root folder within your source files. */
|
||||
"moduleResolution": "node", /* Specify how TypeScript looks up a file from a given module specifier. */
|
||||
// "baseUrl": "./", /* Specify the base directory to resolve non-relative module names. */
|
||||
// "paths": {}, /* Specify a set of entries that re-map imports to additional lookup locations. */
|
||||
// "rootDirs": [], /* Allow multiple folders to be treated as one when resolving modules. */
|
||||
// "typeRoots": [], /* Specify multiple folders that act like './node_modules/@types'. */
|
||||
// "types": [], /* Specify type package names to be included without being referenced in a source file. */
|
||||
// "allowUmdGlobalAccess": true, /* Allow accessing UMD globals from modules. */
|
||||
// "moduleSuffixes": [], /* List of file name suffixes to search when resolving a module. */
|
||||
// "resolveJsonModule": true, /* Enable importing .json files. */
|
||||
// "noResolve": true, /* Disallow 'import's, 'require's or '<reference>'s from expanding the number of files TypeScript should add to a project. */
|
||||
|
||||
/* JavaScript Support */
|
||||
// "allowJs": true, /* Allow JavaScript files to be a part of your program. Use the 'checkJS' option to get errors from these files. */
|
||||
// "checkJs": true, /* Enable error reporting in type-checked JavaScript files. */
|
||||
// "maxNodeModuleJsDepth": 1, /* Specify the maximum folder depth used for checking JavaScript files from 'node_modules'. Only applicable with 'allowJs'. */
|
||||
|
||||
/* Emit */
|
||||
// "declaration": true, /* Generate .d.ts files from TypeScript and JavaScript files in your project. */
|
||||
// "declarationMap": true, /* Create sourcemaps for d.ts files. */
|
||||
// "emitDeclarationOnly": true, /* Only output d.ts files and not JavaScript files. */
|
||||
// "sourceMap": true, /* Create source map files for emitted JavaScript files. */
|
||||
// "outFile": "./", /* Specify a file that bundles all outputs into one JavaScript file. If 'declaration' is true, also designates a file that bundles all .d.ts output. */
|
||||
// "outDir": "./", /* Specify an output folder for all emitted files. */
|
||||
// "removeComments": true, /* Disable emitting comments. */
|
||||
// "noEmit": true, /* Disable emitting files from a compilation. */
|
||||
// "importHelpers": true, /* Allow importing helper functions from tslib once per project, instead of including them per-file. */
|
||||
// "importsNotUsedAsValues": "remove", /* Specify emit/checking behavior for imports that are only used for types. */
|
||||
// "downlevelIteration": true, /* Emit more compliant, but verbose and less performant JavaScript for iteration. */
|
||||
// "sourceRoot": "", /* Specify the root path for debuggers to find the reference source code. */
|
||||
// "mapRoot": "", /* Specify the location where debugger should locate map files instead of generated locations. */
|
||||
// "inlineSourceMap": true, /* Include sourcemap files inside the emitted JavaScript. */
|
||||
// "inlineSources": true, /* Include source code in the sourcemaps inside the emitted JavaScript. */
|
||||
// "emitBOM": true, /* Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. */
|
||||
// "newLine": "crlf", /* Set the newline character for emitting files. */
|
||||
// "stripInternal": true, /* Disable emitting declarations that have '@internal' in their JSDoc comments. */
|
||||
// "noEmitHelpers": true, /* Disable generating custom helper functions like '__extends' in compiled output. */
|
||||
// "noEmitOnError": true, /* Disable emitting files if any type checking errors are reported. */
|
||||
// "preserveConstEnums": true, /* Disable erasing 'const enum' declarations in generated code. */
|
||||
// "declarationDir": "./", /* Specify the output directory for generated declaration files. */
|
||||
// "preserveValueImports": true, /* Preserve unused imported values in the JavaScript output that would otherwise be removed. */
|
||||
|
||||
/* Interop Constraints */
|
||||
// "isolatedModules": true, /* Ensure that each file can be safely transpiled without relying on other imports. */
|
||||
// "allowSyntheticDefaultImports": true, /* Allow 'import x from y' when a module doesn't have a default export. */
|
||||
"esModuleInterop": true, /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables 'allowSyntheticDefaultImports' for type compatibility. */
|
||||
// "preserveSymlinks": true, /* Disable resolving symlinks to their realpath. This correlates to the same flag in node. */
|
||||
"forceConsistentCasingInFileNames": true, /* Ensure that casing is correct in imports. */
|
||||
|
||||
/* Type Checking */
|
||||
"strict": true, /* Enable all strict type-checking options. */
|
||||
// "noImplicitAny": true, /* Enable error reporting for expressions and declarations with an implied 'any' type. */
|
||||
// "strictNullChecks": true, /* When type checking, take into account 'null' and 'undefined'. */
|
||||
// "strictFunctionTypes": true, /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */
|
||||
// "strictBindCallApply": true, /* Check that the arguments for 'bind', 'call', and 'apply' methods match the original function. */
|
||||
// "strictPropertyInitialization": true, /* Check for class properties that are declared but not set in the constructor. */
|
||||
// "noImplicitThis": true, /* Enable error reporting when 'this' is given the type 'any'. */
|
||||
// "useUnknownInCatchVariables": true, /* Default catch clause variables as 'unknown' instead of 'any'. */
|
||||
// "alwaysStrict": true, /* Ensure 'use strict' is always emitted. */
|
||||
// "noUnusedLocals": true, /* Enable error reporting when local variables aren't read. */
|
||||
// "noUnusedParameters": true, /* Raise an error when a function parameter isn't read. */
|
||||
// "exactOptionalPropertyTypes": true, /* Interpret optional property types as written, rather than adding 'undefined'. */
|
||||
// "noImplicitReturns": true, /* Enable error reporting for codepaths that do not explicitly return in a function. */
|
||||
// "noFallthroughCasesInSwitch": true, /* Enable error reporting for fallthrough cases in switch statements. */
|
||||
// "noUncheckedIndexedAccess": true, /* Add 'undefined' to a type when accessed using an index. */
|
||||
// "noImplicitOverride": true, /* Ensure overriding members in derived classes are marked with an override modifier. */
|
||||
// "noPropertyAccessFromIndexSignature": true, /* Enforces using indexed accessors for keys declared using an indexed type. */
|
||||
// "allowUnusedLabels": true, /* Disable error reporting for unused labels. */
|
||||
// "allowUnreachableCode": true, /* Disable error reporting for unreachable code. */
|
||||
|
||||
/* Completeness */
|
||||
// "skipDefaultLibCheck": true, /* Skip type checking .d.ts files that are included with TypeScript. */
|
||||
"skipLibCheck": true /* Skip type checking all .d.ts files. */
|
||||
}
|
||||
}
|
||||
4157
frontend/yarn.lock
Normal file
@ -30,9 +30,12 @@ use self::heuristics::Heuristic;
|
||||
///
|
||||
/// Please note that, due to the nature of the underlying reduced and ordered Bdd, the concept of a [`Term`][crate::datatypes::Term] represents one (sub)formula as well as truth values.
|
||||
pub struct Adf {
|
||||
ordering: VarContainer,
|
||||
bdd: Bdd,
|
||||
ac: Vec<Term>,
|
||||
/// The ordering of the variables in the ADF, including a dictionary for the statements
|
||||
pub ordering: VarContainer,
|
||||
/// The underlying binary decision diagram that represents the ADF
|
||||
pub bdd: Bdd,
|
||||
/// Acceptance Conditions for the ADF
|
||||
pub ac: Vec<Term>,
|
||||
#[serde(skip, default = "Adf::default_rng")]
|
||||
rng: RefCell<StdRng>,
|
||||
}
|
||||
@ -48,6 +51,17 @@ impl Default for Adf {
|
||||
}
|
||||
}
|
||||
|
||||
impl From<(VarContainer, Bdd, Vec<Term>)> for Adf {
|
||||
fn from(source: (VarContainer, Bdd, Vec<Term>)) -> Self {
|
||||
Self {
|
||||
ordering: source.0,
|
||||
bdd: source.1,
|
||||
ac: source.2,
|
||||
rng: Self::default_rng(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Adf {
|
||||
/// Instantiates a new ADF, based on the [parser-data][crate::parser::AdfParser].
|
||||
pub fn from_parser(parser: &AdfParser) -> Self {
|
||||
@ -58,7 +72,7 @@ impl Adf {
|
||||
ac: vec![Term(0); parser.dict_size()],
|
||||
rng: Adf::default_rng(),
|
||||
};
|
||||
(0..parser.dict_size()).into_iter().for_each(|value| {
|
||||
(0..parser.dict_size()).for_each(|value| {
|
||||
log::trace!("adding variable {}", Var(value));
|
||||
result.bdd.variable(Var(value));
|
||||
});
|
||||
|
||||
@ -26,7 +26,8 @@ impl Default for VarContainer {
|
||||
}
|
||||
|
||||
impl VarContainer {
|
||||
pub(crate) fn from_parser(
|
||||
/// Create [`VarContainer`] from its components
|
||||
pub fn from_parser(
|
||||
names: Arc<RwLock<Vec<String>>>,
|
||||
mapping: Arc<RwLock<HashMap<String, usize>>>,
|
||||
) -> VarContainer {
|
||||
@ -51,11 +52,13 @@ impl VarContainer {
|
||||
.and_then(|name| name.get(var.value()).cloned())
|
||||
}
|
||||
|
||||
pub(crate) fn names(&self) -> Arc<RwLock<Vec<String>>> {
|
||||
/// Return ordered names from [`VarContainer`]
|
||||
pub fn names(&self) -> Arc<RwLock<Vec<String>>> {
|
||||
Arc::clone(&self.names)
|
||||
}
|
||||
|
||||
pub(crate) fn mappings(&self) -> Arc<RwLock<HashMap<String, usize>>> {
|
||||
/// Return map from names to indices in [`VarContainer`]
|
||||
pub fn mappings(&self) -> Arc<RwLock<HashMap<String, usize>>> {
|
||||
Arc::clone(&self.mapping)
|
||||
}
|
||||
|
||||
|
||||
140
lib/src/lib.rs
@ -198,6 +198,146 @@ while let Ok(result) = r.recv() {
|
||||
solving.join().unwrap();
|
||||
|
||||
```
|
||||
|
||||
### Serialize and Deserialize custom datastructures representing an [`adf::Adf`]
|
||||
|
||||
The web application <https://adf-bdd.dev> uses custom data structures that are stored in a MongoDB database, which inspired this example.
|
||||
|
||||
```rust
|
||||
use std::sync::{Arc, RwLock};
|
||||
use serde::{Deserialize, Serialize};
|
||||
use std::collections::HashMap;
|
||||
use adf_bdd::datatypes::adf::VarContainer;
|
||||
use adf_bdd::datatypes::{BddNode, Term, Var};
|
||||
use adf_bdd::obdd::Bdd;
|
||||
use adf_bdd::parser::AdfParser;
|
||||
use adf_bdd::adf::Adf;
|
||||
|
||||
// Custom Datastructures for (De-)Serialization
|
||||
|
||||
# #[derive(PartialEq, Debug)]
|
||||
#[derive(Deserialize, Serialize)]
|
||||
struct MyCustomVarContainer {
|
||||
names: Vec<String>,
|
||||
mapping: HashMap<String, String>,
|
||||
}
|
||||
|
||||
impl From<VarContainer> for MyCustomVarContainer {
|
||||
fn from(source: VarContainer) -> Self {
|
||||
Self {
|
||||
names: source.names().read().unwrap().clone(),
|
||||
mapping: source
|
||||
.mappings()
|
||||
.read()
|
||||
.unwrap()
|
||||
.iter()
|
||||
.map(|(k, v)| (k.clone(), v.to_string()))
|
||||
.collect(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<MyCustomVarContainer> for VarContainer {
|
||||
fn from(source: MyCustomVarContainer) -> Self {
|
||||
Self::from_parser(
|
||||
Arc::new(RwLock::new(source.names)),
|
||||
Arc::new(RwLock::new(
|
||||
source
|
||||
.mapping
|
||||
.into_iter()
|
||||
.map(|(k, v)| (k, v.parse().unwrap()))
|
||||
.collect(),
|
||||
)),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
# #[derive(PartialEq, Debug)]
|
||||
#[derive(Deserialize, Serialize)]
|
||||
struct MyCustomBddNode {
|
||||
var: String,
|
||||
lo: String,
|
||||
hi: String,
|
||||
}
|
||||
|
||||
impl From<BddNode> for MyCustomBddNode {
|
||||
fn from(source: BddNode) -> Self {
|
||||
Self {
|
||||
var: source.var().0.to_string(),
|
||||
lo: source.lo().0.to_string(),
|
||||
hi: source.hi().0.to_string(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<MyCustomBddNode> for BddNode {
|
||||
fn from(source: MyCustomBddNode) -> Self {
|
||||
Self::new(
|
||||
Var(source.var.parse().unwrap()),
|
||||
Term(source.lo.parse().unwrap()),
|
||||
Term(source.hi.parse().unwrap()),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
# #[derive(PartialEq, Debug)]
|
||||
#[derive(Deserialize, Serialize)]
|
||||
struct MyCustomAdf {
|
||||
ordering: MyCustomVarContainer,
|
||||
bdd: Vec<MyCustomBddNode>,
|
||||
ac: Vec<String>,
|
||||
}
|
||||
|
||||
impl From<Adf> for MyCustomAdf {
|
||||
fn from(source: Adf) -> Self {
|
||||
Self {
|
||||
ordering: source.ordering.into(),
|
||||
bdd: source.bdd.nodes.into_iter().map(Into::into).collect(),
|
||||
ac: source.ac.into_iter().map(|t| t.0.to_string()).collect(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<MyCustomAdf> for Adf {
|
||||
fn from(source: MyCustomAdf) -> Self {
|
||||
let bdd = Bdd::from(source.bdd.into_iter().map(Into::into).collect::<Vec<BddNode>>());
|
||||
|
||||
Adf::from((
|
||||
source.ordering.into(),
|
||||
bdd,
|
||||
source
|
||||
.ac
|
||||
.into_iter()
|
||||
.map(|t| Term(t.parse().unwrap()))
|
||||
.collect(),
|
||||
))
|
||||
}
|
||||
}
|
||||
|
||||
// use the above example as input
|
||||
let input = "s(a).s(b).s(c).s(d).ac(a,c(v)).ac(b,or(a,b)).ac(c,neg(b)).ac(d,d).";
|
||||
let parser = AdfParser::default();
|
||||
parser.parse()(&input).unwrap();
|
||||
|
||||
// create Adf
|
||||
let adf = Adf::from_parser(&parser);
|
||||
|
||||
// cast into custom struct
|
||||
let my_custom_adf: MyCustomAdf = adf.into();
|
||||
|
||||
// stringify to json
|
||||
let json: String = serde_json::to_string(&my_custom_adf).unwrap();
|
||||
|
||||
// parse json
|
||||
let parsed_custom_adf: MyCustomAdf = serde_json::from_str(&json).unwrap();
|
||||
|
||||
// cast into lib struct that resembles the original Adf
|
||||
let parsed_adf: Adf = parsed_custom_adf.into();
|
||||
|
||||
# let my_custom_adf2: MyCustomAdf = parsed_adf.into();
|
||||
# assert_eq!(my_custom_adf, my_custom_adf2);
|
||||
```
|
||||
|
||||
*/
|
||||
#![deny(
|
||||
missing_debug_implementations,
|
||||
|
||||
@ -82,7 +82,7 @@ impl NoGood {
|
||||
visit.then_some(result)
|
||||
}
|
||||
|
||||
/// Creates an updated [Vec<Term>], based on the given [&[Term]] and the [NoGood].
|
||||
/// Creates an updated [`Vec<Term>`], based on the given [&[Term]] and the [NoGood].
|
||||
/// The parameter _update_ is set to [`true`] if there has been an update and to [`false`] otherwise
|
||||
pub fn update_term_vec(&self, term_vec: &[Term], update: &mut bool) -> Vec<Term> {
|
||||
*update = false;
|
||||
|
||||
@ -13,7 +13,8 @@ use std::{cell::RefCell, cmp::min, collections::HashMap, fmt::Display};
|
||||
/// Each roBDD is identified by its corresponding [`Term`], which implicitly identifies the root node of a roBDD.
|
||||
#[derive(Debug, Serialize, Deserialize)]
|
||||
pub struct Bdd {
|
||||
pub(crate) nodes: Vec<BddNode>,
|
||||
/// The nodes of the [`Bdd`] with their edges
|
||||
pub nodes: Vec<BddNode>,
|
||||
#[cfg(feature = "variablelist")]
|
||||
#[serde(skip)]
|
||||
var_deps: Vec<HashSet<Var>>,
|
||||
@ -49,6 +50,18 @@ impl Default for Bdd {
|
||||
}
|
||||
}
|
||||
|
||||
impl From<Vec<BddNode>> for Bdd {
|
||||
fn from(nodes: Vec<BddNode>) -> Self {
|
||||
let mut bdd = Self::new();
|
||||
|
||||
for node in nodes {
|
||||
bdd.node(node.var(), node.lo(), node.hi());
|
||||
}
|
||||
|
||||
bdd
|
||||
}
|
||||
}
|
||||
|
||||
impl Bdd {
|
||||
/// Instantiate a new roBDD structure.
|
||||
/// Constants for the [`⊤`][crate::datatypes::Term::TOP] and [`⊥`][crate::datatypes::Term::BOT] concepts are prepared in that step too.
|
||||
@ -556,7 +569,7 @@ impl Bdd {
|
||||
|
||||
/// Counts how often another roBDD uses a [variable][crate::datatypes::Var], which occurs in this roBDD.
|
||||
pub fn active_var_impact(&self, var: Var, termlist: &[Term]) -> usize {
|
||||
(0..termlist.len()).into_iter().fold(0usize, |acc, idx| {
|
||||
(0..termlist.len()).fold(0usize, |acc, idx| {
|
||||
if self
|
||||
.var_dependencies(termlist[var.value()])
|
||||
.contains(&Var(idx))
|
||||
|
||||
32
server/Cargo.toml
Normal file
@ -0,0 +1,32 @@
|
||||
[package]
|
||||
name = "adf-bdd-server"
|
||||
version = "0.3.0"
|
||||
authors = ["Lukas Gerlach <lukas.gerlach@tu-dresden.de>"]
|
||||
edition = "2021"
|
||||
homepage = "https://ellmau.github.io/adf-obdd"
|
||||
repository = "https://github.com/ellmau/adf-obdd"
|
||||
license = "MIT"
|
||||
exclude = ["res/", "./flake*", "*.nix", ".envrc", "_config.yml", "tarpaulin-report.*", "*~"]
|
||||
description = "Offer Solving ADFs as a service"
|
||||
|
||||
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
|
||||
|
||||
[dependencies]
|
||||
adf_bdd = { version="0.3.1", path="../lib", features = ["frontend"] }
|
||||
actix-web = "4"
|
||||
actix-cors = "0.6"
|
||||
actix-files = "0.6"
|
||||
env_logger = "0.9"
|
||||
log = "0.4"
|
||||
serde = "1"
|
||||
mongodb = "2.4.0"
|
||||
actix-identity = "0.5.2"
|
||||
argon2 = "0.5.0"
|
||||
actix-session = { version="0.7.2", features = ["cookie-session"] }
|
||||
names = "0.14.0"
|
||||
futures-util = "0.3.28"
|
||||
actix-multipart = "0.6.0"
|
||||
|
||||
[features]
|
||||
cors_for_local_development = []
|
||||
mock_long_computations = []
|
||||
13
server/README.md
Normal file
@ -0,0 +1,13 @@
|
||||
# Backend for the Web Service

This directory contains the backend for <https://adf-bdd.dev>, built with Actix Web (<https://actix.rs>).

## Usage

For local development run:

- `docker compose up` to run a MongoDB including a web admin interface
- `MONGODB_URI=mongodb://root:example@localhost:27017/ cargo run -F cors_for_local_development -F mock_long_computations` to start the server, connecting it to the MongoDB and allowing CORS from the frontend (running on a separate development server)

The server listens on `localhost:8080`.
The feature flag `-F mock_long_computations` is optional and just mimics longer computation times by using `std::thread::sleep`. This can be helpful for checking how the frontend behaves in such cases.
|
||||
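As a rough sketch of how a client talks to the ADF endpoints defined in `server/src/adf.rs`: the `/adfs` prefix, host, and concrete values below are assumptions for illustration; the actual route prefix is set where the handlers are mounted and may differ.

```ts
// Hedged sketch, not part of this PR. Assumes the server runs on localhost:8080
// and the ADF handlers are mounted under '/adfs'.
const BASE = 'http://localhost:8080/adfs';

// POST /add expects a multipart form with `name`, `parsing` and either `code` or `file`.
async function addProblem(name: string, code: string): Promise<string> {
  const form = new FormData();
  form.append('name', name);
  form.append('code', code);       // alternatively append a File under 'file'
  form.append('parsing', 'Naive'); // or 'Hybrid'
  const res = await fetch(`${BASE}/add`, {
    method: 'POST',
    body: form,
    credentials: 'include',        // the identity cookie picks (or creates) the user
  });
  return res.text();               // "Parsing started..."
}

// PUT /{problem_name}/solve expects a JSON body with a `strategy` field.
async function solveProblem(problemName: string): Promise<string> {
  const res = await fetch(`${BASE}/${problemName}/solve`, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ strategy: 'Ground' }), // see the Strategy enum for other values
    credentials: 'include',
  });
  return res.text();               // "Solving started..."; poll GET `${BASE}/${problemName}` for results
}
```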
24
server/docker-compose.yml
Normal file
@ -0,0 +1,24 @@
|
||||
version: '3.1'
|
||||
|
||||
services:
|
||||
|
||||
mongo:
|
||||
image: mongo
|
||||
restart: always
|
||||
ports:
|
||||
- 27017:27017
|
||||
environment:
|
||||
MONGO_INITDB_ROOT_USERNAME: root
|
||||
MONGO_INITDB_ROOT_PASSWORD: example
|
||||
volumes:
|
||||
- ./mongodb-data:/data/db
|
||||
|
||||
mongo-express:
|
||||
image: mongo-express
|
||||
restart: always
|
||||
ports:
|
||||
- 8081:8081
|
||||
environment:
|
||||
ME_CONFIG_MONGODB_ADMINUSERNAME: root
|
||||
ME_CONFIG_MONGODB_ADMINPASSWORD: example
|
||||
ME_CONFIG_MONGODB_URL: mongodb://root:example@mongo:27017/
|
||||
747
server/src/adf.rs
Normal file
@ -0,0 +1,747 @@
|
||||
use std::collections::{HashMap, HashSet};
|
||||
use std::sync::{Arc, RwLock};
|
||||
#[cfg(feature = "mock_long_computations")]
|
||||
use std::time::Duration;
|
||||
|
||||
use actix_identity::Identity;
|
||||
use actix_multipart::form::{tempfile::TempFile, text::Text, MultipartForm};
|
||||
use actix_web::rt::spawn;
|
||||
use actix_web::rt::task::spawn_blocking;
|
||||
use actix_web::rt::time::timeout;
|
||||
use actix_web::{delete, get, post, put, web, HttpMessage, HttpRequest, HttpResponse, Responder};
|
||||
use adf_bdd::datatypes::adf::VarContainer;
|
||||
use adf_bdd::datatypes::{BddNode, Term, Var};
|
||||
use futures_util::{FutureExt, TryStreamExt};
|
||||
use mongodb::bson::doc;
|
||||
use mongodb::bson::{to_bson, Bson};
|
||||
use mongodb::results::DeleteResult;
|
||||
use names::{Generator, Name};
|
||||
use serde::{Deserialize, Serialize};
|
||||
|
||||
use adf_bdd::adf::Adf;
|
||||
use adf_bdd::adfbiodivine::Adf as BdAdf;
|
||||
use adf_bdd::obdd::Bdd;
|
||||
use adf_bdd::parser::AdfParser;
|
||||
|
||||
use crate::config::{AppState, RunningInfo, Task, ADF_COLL, COMPUTE_TIME, DB_NAME, USER_COLL};
|
||||
use crate::user::{username_exists, User};
|
||||
|
||||
use crate::double_labeled_graph::DoubleLabeledGraph;
|
||||
|
||||
type Ac = Vec<Term>;
|
||||
type AcDb = Vec<String>;
|
||||
|
||||
#[derive(Copy, Clone, Debug, Deserialize, Serialize)]
|
||||
pub(crate) enum Parsing {
|
||||
Naive,
|
||||
Hybrid,
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash, Deserialize, Serialize)]
|
||||
pub(crate) enum Strategy {
|
||||
Ground,
|
||||
Complete,
|
||||
Stable,
|
||||
StableCountingA,
|
||||
StableCountingB,
|
||||
StableNogood,
|
||||
}
|
||||
|
||||
#[derive(Clone, Deserialize, Serialize)]
|
||||
pub(crate) struct AcAndGraph {
|
||||
pub(crate) ac: AcDb,
|
||||
pub(crate) graph: DoubleLabeledGraph,
|
||||
}
|
||||
|
||||
impl From<AcAndGraph> for Bson {
|
||||
fn from(source: AcAndGraph) -> Self {
|
||||
to_bson(&source).expect("Serialization should work")
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Default, Deserialize, Serialize)]
|
||||
#[serde(tag = "type", content = "content")]
|
||||
pub(crate) enum OptionWithError<T> {
|
||||
Some(T),
|
||||
Error(String),
|
||||
#[default]
|
||||
None,
|
||||
}
|
||||
|
||||
impl<T> OptionWithError<T> {
|
||||
fn is_some(&self) -> bool {
|
||||
matches!(self, Self::Some(_))
|
||||
}
|
||||
}
|
||||
|
||||
impl<T: Serialize> From<OptionWithError<T>> for Bson {
|
||||
fn from(source: OptionWithError<T>) -> Self {
|
||||
to_bson(&source).expect("Serialization should work")
|
||||
}
|
||||
}
|
||||
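Because the frontend consumes these values as JSON, it may help to spell out the wire format produced by the adjacently tagged serde attribute above. A TypeScript mirror (an illustrative sketch, not a file in this PR) would be:

```ts
// Mirrors OptionWithError<T> with #[serde(tag = "type", content = "content")]:
// the unit variant `None` is serialized without a "content" key.
type OptionWithError<T> =
  | { type: 'Some'; content: T }
  | { type: 'Error'; content: string }
  | { type: 'None' };

// e.g. acs_per_strategy.parse_only as stored in MongoDB / returned to the client:
// { "type": "Some", "content": [{ "ac": ["..."], "graph": { /* DoubleLabeledGraph */ } }] }
```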
|
||||
type AcsAndGraphsOpt = OptionWithError<Vec<AcAndGraph>>;
|
||||
|
||||
#[derive(Default, Deserialize, Serialize)]
|
||||
pub(crate) struct AcsPerStrategy {
|
||||
pub(crate) parse_only: AcsAndGraphsOpt,
|
||||
pub(crate) ground: AcsAndGraphsOpt,
|
||||
pub(crate) complete: AcsAndGraphsOpt,
|
||||
pub(crate) stable: AcsAndGraphsOpt,
|
||||
pub(crate) stable_counting_a: AcsAndGraphsOpt,
|
||||
pub(crate) stable_counting_b: AcsAndGraphsOpt,
|
||||
pub(crate) stable_nogood: AcsAndGraphsOpt,
|
||||
}
|
||||
|
||||
#[derive(Clone, Deserialize, Serialize)]
|
||||
pub(crate) struct VarContainerDb {
|
||||
names: Vec<String>,
|
||||
mapping: HashMap<String, String>,
|
||||
}
|
||||
|
||||
impl From<VarContainer> for VarContainerDb {
|
||||
fn from(source: VarContainer) -> Self {
|
||||
Self {
|
||||
names: source.names().read().unwrap().clone(),
|
||||
mapping: source
|
||||
.mappings()
|
||||
.read()
|
||||
.unwrap()
|
||||
.iter()
|
||||
.map(|(k, v)| (k.clone(), v.to_string()))
|
||||
.collect(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<VarContainerDb> for VarContainer {
|
||||
fn from(source: VarContainerDb) -> Self {
|
||||
Self::from_parser(
|
||||
Arc::new(RwLock::new(source.names)),
|
||||
Arc::new(RwLock::new(
|
||||
source
|
||||
.mapping
|
||||
.into_iter()
|
||||
.map(|(k, v)| (k, v.parse().unwrap()))
|
||||
.collect(),
|
||||
)),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Deserialize, Serialize)]
|
||||
pub(crate) struct BddNodeDb {
|
||||
var: String,
|
||||
lo: String,
|
||||
hi: String,
|
||||
}
|
||||
|
||||
impl From<BddNode> for BddNodeDb {
|
||||
fn from(source: BddNode) -> Self {
|
||||
Self {
|
||||
var: source.var().0.to_string(),
|
||||
lo: source.lo().0.to_string(),
|
||||
hi: source.hi().0.to_string(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<BddNodeDb> for BddNode {
|
||||
fn from(source: BddNodeDb) -> Self {
|
||||
Self::new(
|
||||
Var(source.var.parse().unwrap()),
|
||||
Term(source.lo.parse().unwrap()),
|
||||
Term(source.hi.parse().unwrap()),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
type SimplifiedBdd = Vec<BddNodeDb>;
|
||||
|
||||
#[derive(Clone, Deserialize, Serialize)]
|
||||
pub(crate) struct SimplifiedAdf {
|
||||
pub(crate) ordering: VarContainerDb,
|
||||
pub(crate) bdd: SimplifiedBdd,
|
||||
pub(crate) ac: AcDb,
|
||||
}
|
||||
|
||||
impl From<Adf> for SimplifiedAdf {
|
||||
fn from(source: Adf) -> Self {
|
||||
Self {
|
||||
ordering: source.ordering.into(),
|
||||
bdd: source.bdd.nodes.into_iter().map(Into::into).collect(),
|
||||
ac: source.ac.into_iter().map(|t| t.0.to_string()).collect(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<SimplifiedAdf> for Adf {
|
||||
fn from(source: SimplifiedAdf) -> Self {
|
||||
let bdd = Bdd::from(
|
||||
source
|
||||
.bdd
|
||||
.into_iter()
|
||||
.map(Into::into)
|
||||
.collect::<Vec<BddNode>>(),
|
||||
);
|
||||
|
||||
Adf::from((
|
||||
source.ordering.into(),
|
||||
bdd,
|
||||
source
|
||||
.ac
|
||||
.into_iter()
|
||||
.map(|t| Term(t.parse().unwrap()))
|
||||
.collect(),
|
||||
))
|
||||
}
|
||||
}
|
||||
|
||||
type SimplifiedAdfOpt = OptionWithError<SimplifiedAdf>;
|
||||
|
||||
#[derive(Deserialize, Serialize)]
|
||||
pub(crate) struct AdfProblem {
|
||||
pub(crate) name: String,
|
||||
pub(crate) username: String,
|
||||
pub(crate) code: String,
|
||||
pub(crate) parsing_used: Parsing,
|
||||
pub(crate) adf: SimplifiedAdfOpt,
|
||||
pub(crate) acs_per_strategy: AcsPerStrategy,
|
||||
}
|
||||
|
||||
#[derive(MultipartForm)]
|
||||
struct AddAdfProblemBodyMultipart {
|
||||
name: Text<String>,
|
||||
code: Option<Text<String>>, // Either Code or File is set
|
||||
file: Option<TempFile>, // Either Code or File is set
|
||||
parsing: Text<Parsing>,
|
||||
}
|
||||
|
||||
#[derive(Clone)]
|
||||
struct AddAdfProblemBodyPlain {
|
||||
name: String,
|
||||
code: String,
|
||||
parsing: Parsing,
|
||||
}
|
||||
|
||||
impl TryFrom<AddAdfProblemBodyMultipart> for AddAdfProblemBodyPlain {
|
||||
type Error = &'static str;
|
||||
|
||||
fn try_from(source: AddAdfProblemBodyMultipart) -> Result<Self, Self::Error> {
|
||||
Ok(Self {
|
||||
name: source.name.into_inner(),
|
||||
code: source
|
||||
.file
|
||||
.map(|f| std::io::read_to_string(f.file).expect("TempFile should be readable"))
|
||||
.or_else(|| source.code.map(|c| c.into_inner()))
|
||||
.and_then(|code| (!code.is_empty()).then_some(code))
|
||||
.ok_or("Either a file or the code has to be provided.")?,
|
||||
parsing: source.parsing.into_inner(),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
async fn adf_problem_exists(
|
||||
adf_coll: &mongodb::Collection<AdfProblem>,
|
||||
name: &str,
|
||||
username: &str,
|
||||
) -> bool {
|
||||
adf_coll
|
||||
.find_one(doc! { "name": name, "username": username }, None)
|
||||
.await
|
||||
.ok()
|
||||
.flatten()
|
||||
.is_some()
|
||||
}
|
||||
|
||||
#[derive(Serialize)]
|
||||
struct AdfProblemInfo {
|
||||
name: String,
|
||||
code: String,
|
||||
parsing_used: Parsing,
|
||||
acs_per_strategy: AcsPerStrategy,
|
||||
running_tasks: Vec<Task>,
|
||||
}
|
||||
|
||||
impl AdfProblemInfo {
|
||||
fn from_adf_prob_and_tasks(adf: AdfProblem, tasks: &HashSet<RunningInfo>) -> Self {
|
||||
AdfProblemInfo {
|
||||
name: adf.name.clone(),
|
||||
code: adf.code,
|
||||
parsing_used: adf.parsing_used,
|
||||
acs_per_strategy: adf.acs_per_strategy,
|
||||
running_tasks: tasks
|
||||
.iter()
|
||||
.filter_map(|t| {
|
||||
(t.adf_name == adf.name && t.username == adf.username).then_some(t.task)
|
||||
})
|
||||
.collect(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[post("/add")]
|
||||
async fn add_adf_problem(
|
||||
req: HttpRequest,
|
||||
app_state: web::Data<AppState>,
|
||||
identity: Option<Identity>,
|
||||
req_body: MultipartForm<AddAdfProblemBodyMultipart>,
|
||||
) -> impl Responder {
|
||||
let adf_problem_input: AddAdfProblemBodyPlain = match req_body.into_inner().try_into() {
|
||||
Ok(input) => input,
|
||||
Err(err) => return HttpResponse::BadRequest().body(err),
|
||||
};
|
||||
let adf_coll: mongodb::Collection<AdfProblem> = app_state
|
||||
.mongodb_client
|
||||
.database(DB_NAME)
|
||||
.collection(ADF_COLL);
|
||||
let user_coll: mongodb::Collection<User> = app_state
|
||||
.mongodb_client
|
||||
.database(DB_NAME)
|
||||
.collection(USER_COLL);
|
||||
|
||||
let username = match identity.map(|id| id.id()) {
|
||||
None => {
|
||||
// Create and log in temporary user
|
||||
let gen = Generator::with_naming(Name::Numbered);
|
||||
let candidates = gen.take(10);
|
||||
|
||||
let mut name: Option<String> = None;
|
||||
for candidate in candidates {
|
||||
if name.is_some() {
|
||||
continue;
|
||||
}
|
||||
|
||||
if !(username_exists(&user_coll, &candidate).await) {
|
||||
name = Some(candidate);
|
||||
}
|
||||
}
|
||||
|
||||
let username = match name {
|
||||
Some(name) => name,
|
||||
None => {
|
||||
return HttpResponse::InternalServerError().body("Could not generate new name.")
|
||||
}
|
||||
};
|
||||
|
||||
match user_coll
|
||||
.insert_one(
|
||||
User {
|
||||
username: username.clone(),
|
||||
password: None,
|
||||
},
|
||||
None,
|
||||
)
|
||||
.await
|
||||
{
|
||||
Ok(_) => (),
|
||||
Err(err) => return HttpResponse::InternalServerError().body(err.to_string()),
|
||||
}
|
||||
|
||||
Identity::login(&req.extensions(), username.clone()).unwrap();
|
||||
|
||||
username
|
||||
}
|
||||
Some(Err(err)) => return HttpResponse::InternalServerError().body(err.to_string()),
|
||||
Some(Ok(username)) => username,
|
||||
};
|
||||
|
||||
let problem_name = if !adf_problem_input.name.is_empty() {
|
||||
if adf_problem_exists(&adf_coll, &adf_problem_input.name, &username).await {
|
||||
return HttpResponse::Conflict()
|
||||
.body("ADF Problem with that name already exists. Please pick another one!");
|
||||
}
|
||||
|
||||
adf_problem_input.name.clone()
|
||||
} else {
|
||||
let gen = Generator::with_naming(Name::Numbered);
|
||||
let candidates = gen.take(10);
|
||||
|
||||
let mut name: Option<String> = None;
|
||||
for candidate in candidates {
|
||||
if name.is_some() {
|
||||
continue;
|
||||
}
|
||||
|
||||
if !(adf_problem_exists(&adf_coll, &candidate, &username).await) {
|
||||
name = Some(candidate);
|
||||
}
|
||||
}
|
||||
|
||||
match name {
|
||||
Some(name) => name,
|
||||
None => {
|
||||
return HttpResponse::InternalServerError().body("Could not generate new name.")
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
let adf_problem: AdfProblem = AdfProblem {
|
||||
name: problem_name.clone(),
|
||||
username: username.clone(),
|
||||
code: adf_problem_input.code.clone(),
|
||||
parsing_used: adf_problem_input.parsing,
|
||||
adf: SimplifiedAdfOpt::None,
|
||||
acs_per_strategy: AcsPerStrategy::default(),
|
||||
};
|
||||
|
||||
let result = adf_coll.insert_one(&adf_problem, None).await;
|
||||
|
||||
if let Err(err) = result {
|
||||
return HttpResponse::InternalServerError()
|
||||
.body(format!("Could not create Database entry. Error: {err}"));
|
||||
}
|
||||
|
||||
let username_clone = username.clone();
|
||||
let problem_name_clone = problem_name.clone();
|
||||
|
||||
let adf_fut = timeout(
|
||||
COMPUTE_TIME,
|
||||
spawn_blocking(move || {
|
||||
let running_info = RunningInfo {
|
||||
username: username_clone,
|
||||
adf_name: problem_name_clone,
|
||||
task: Task::Parse,
|
||||
};
|
||||
|
||||
app_state
|
||||
.currently_running
|
||||
.lock()
|
||||
.unwrap()
|
||||
.insert(running_info.clone());
|
||||
|
||||
#[cfg(feature = "mock_long_computations")]
|
||||
std::thread::sleep(Duration::from_secs(20));
|
||||
|
||||
let parser = AdfParser::default();
|
||||
let parse_result = parser.parse()(&adf_problem_input.code)
|
||||
.map_err(|_| "ADF could not be parsed, double check your input!");
|
||||
|
||||
let result = parse_result.map(|_| {
|
||||
let lib_adf = match adf_problem_input.parsing {
|
||||
Parsing::Naive => Adf::from_parser(&parser),
|
||||
Parsing::Hybrid => {
|
||||
let bd_adf = BdAdf::from_parser(&parser);
|
||||
bd_adf.hybrid_step_opt(false)
|
||||
}
|
||||
};
|
||||
|
||||
let ac_and_graph = AcAndGraph {
|
||||
ac: lib_adf.ac.iter().map(|t| t.0.to_string()).collect(),
|
||||
graph: DoubleLabeledGraph::from_adf_and_ac(&lib_adf, None),
|
||||
};
|
||||
|
||||
(SimplifiedAdf::from(lib_adf), ac_and_graph)
|
||||
});
|
||||
|
||||
app_state
|
||||
.currently_running
|
||||
.lock()
|
||||
.unwrap()
|
||||
.remove(&running_info);
|
||||
|
||||
result
|
||||
}),
|
||||
);
|
||||
|
||||
spawn(adf_fut.then(move |adf_res| async move {
|
||||
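// adf_res is nested three deep: the outer Result comes from `timeout`
// (Err = COMPUTE_TIME exceeded), the next from `spawn_blocking` (Err = the
// blocking task failed), and the innermost from parsing the ADF code.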
let (adf, ac_and_graph): (SimplifiedAdfOpt, AcsAndGraphsOpt) = match adf_res {
|
||||
Err(err) => (
|
||||
SimplifiedAdfOpt::Error(err.to_string()),
|
||||
AcsAndGraphsOpt::Error(err.to_string()),
|
||||
),
|
||||
Ok(Err(err)) => (
|
||||
SimplifiedAdfOpt::Error(err.to_string()),
|
||||
AcsAndGraphsOpt::Error(err.to_string()),
|
||||
),
|
||||
Ok(Ok(Err(err))) => (
|
||||
SimplifiedAdfOpt::Error(err.to_string()),
|
||||
AcsAndGraphsOpt::Error(err.to_string()),
|
||||
),
|
||||
Ok(Ok(Ok((adf, ac_and_graph)))) => (
|
||||
SimplifiedAdfOpt::Some(adf),
|
||||
AcsAndGraphsOpt::Some(vec![ac_and_graph]),
|
||||
),
|
||||
};
|
||||
|
||||
let result = adf_coll
|
||||
.update_one(
|
||||
doc! { "name": problem_name, "username": username },
|
||||
doc! { "$set": { "adf": &adf, "acs_per_strategy.parse_only": &ac_and_graph } },
|
||||
None,
|
||||
)
|
||||
.await;
|
||||
|
||||
if let Err(err) = result {
|
||||
log::error!("{err}");
|
||||
}
|
||||
}));
|
||||
|
||||
HttpResponse::Ok().body("Parsing started...")
|
||||
}
|
||||
|
||||
#[derive(Deserialize)]
|
||||
struct SolveAdfProblemBody {
|
||||
strategy: Strategy,
|
||||
}
|
||||
|
||||
#[put("/{problem_name}/solve")]
|
||||
async fn solve_adf_problem(
|
||||
app_state: web::Data<AppState>,
|
||||
identity: Option<Identity>,
|
||||
path: web::Path<String>,
|
||||
req_body: web::Json<SolveAdfProblemBody>,
|
||||
) -> impl Responder {
|
||||
let problem_name = path.into_inner();
|
||||
let adf_problem_input: SolveAdfProblemBody = req_body.into_inner();
|
||||
let adf_coll: mongodb::Collection<AdfProblem> = app_state
|
||||
.mongodb_client
|
||||
.database(DB_NAME)
|
||||
.collection(ADF_COLL);
|
||||
|
||||
let username = match identity.map(|id| id.id()) {
|
||||
None => {
|
||||
return HttpResponse::Unauthorized().body("You need to log in to solve an ADF problem.")
|
||||
}
|
||||
Some(Err(err)) => return HttpResponse::InternalServerError().body(err.to_string()),
|
||||
Some(Ok(username)) => username,
|
||||
};
|
||||
|
||||
let adf_problem = match adf_coll
|
||||
.find_one(doc! { "name": &problem_name, "username": &username }, None)
|
||||
.await
|
||||
{
|
||||
Err(err) => return HttpResponse::InternalServerError().body(err.to_string()),
|
||||
Ok(None) => {
|
||||
return HttpResponse::NotFound()
|
||||
.body(format!("ADF problem with name {problem_name} not found."))
|
||||
}
|
||||
Ok(Some(prob)) => prob,
|
||||
};
|
||||
|
||||
let simp_adf: SimplifiedAdf = match adf_problem.adf {
|
||||
SimplifiedAdfOpt::None => {
|
||||
return HttpResponse::BadRequest().body("The ADF problem has not been parsed yet.")
|
||||
}
|
||||
SimplifiedAdfOpt::Error(err) => {
|
||||
return HttpResponse::BadRequest().body(format!(
|
||||
"The ADF problem could not be parsed. Update it and try again. Error: {err}"
|
||||
))
|
||||
}
|
||||
SimplifiedAdfOpt::Some(adf) => adf,
|
||||
};
|
||||
|
||||
let has_been_solved = match adf_problem_input.strategy {
|
||||
Strategy::Complete => adf_problem.acs_per_strategy.complete.is_some(),
|
||||
Strategy::Ground => adf_problem.acs_per_strategy.ground.is_some(),
|
||||
Strategy::Stable => adf_problem.acs_per_strategy.stable.is_some(),
|
||||
Strategy::StableCountingA => adf_problem.acs_per_strategy.stable_counting_a.is_some(),
|
||||
Strategy::StableCountingB => adf_problem.acs_per_strategy.stable_counting_b.is_some(),
|
||||
Strategy::StableNogood => adf_problem.acs_per_strategy.stable_nogood.is_some(),
|
||||
};
|
||||
|
||||
let username_clone = username.clone();
|
||||
let problem_name_clone = problem_name.clone();
|
||||
|
||||
let running_info = RunningInfo {
|
||||
username: username_clone,
|
||||
adf_name: problem_name_clone,
|
||||
task: Task::Solve(adf_problem_input.strategy),
|
||||
};
|
||||
|
||||
// NOTE: we could also return the result here instead of throwing an error but I think the canonical way should just be to call the get endpoint for the problem.
|
||||
if has_been_solved
|
||||
|| app_state
|
||||
.currently_running
|
||||
.lock()
|
||||
.unwrap()
|
||||
.contains(&running_info)
|
||||
{
|
||||
return HttpResponse::Conflict()
|
||||
.body("The ADF problem has already been solved with this strategy, or solving is currently in progress. You can get the solution from the problem data directly.");
|
||||
}
|
||||
|
||||
let acs_and_graphs_fut = timeout(
|
||||
COMPUTE_TIME,
|
||||
spawn_blocking(move || {
|
||||
app_state
|
||||
.currently_running
|
||||
.lock()
|
||||
.unwrap()
|
||||
.insert(running_info.clone());
|
||||
|
||||
#[cfg(feature = "mock_long_computations")]
|
||||
std::thread::sleep(Duration::from_secs(20));
|
||||
|
||||
let mut adf: Adf = simp_adf.into();
|
||||
|
||||
let acs: Vec<Ac> = match adf_problem_input.strategy {
|
||||
Strategy::Complete => adf.complete().collect(),
|
||||
Strategy::Ground => vec![adf.grounded()],
|
||||
Strategy::Stable => adf.stable().collect(),
|
||||
// TODO: INPUT VALIDATION: only allow this for hybrid parsing
|
||||
Strategy::StableCountingA => adf.stable_count_optimisation_heu_a().collect(),
|
||||
// TODO: INPUT VALIDATION: only allow this for hybrid parsing
|
||||
Strategy::StableCountingB => adf.stable_count_optimisation_heu_b().collect(),
|
||||
// TODO: support more than just default heuristics
|
||||
Strategy::StableNogood => adf
|
||||
.stable_nogood(adf_bdd::adf::heuristics::Heuristic::default())
|
||||
.collect(),
|
||||
};
|
||||
|
||||
let acs_and_graphs: Vec<AcAndGraph> = acs
|
||||
.iter()
|
||||
.map(|ac| AcAndGraph {
|
||||
ac: ac.iter().map(|t| t.0.to_string()).collect(),
|
||||
graph: DoubleLabeledGraph::from_adf_and_ac(&adf, Some(ac)),
|
||||
})
|
||||
.collect();
|
||||
|
||||
app_state
|
||||
.currently_running
|
||||
.lock()
|
||||
.unwrap()
|
||||
.remove(&running_info);
|
||||
|
||||
acs_and_graphs
|
||||
}),
|
||||
);
|
||||
|
||||
spawn(acs_and_graphs_fut.then(move |acs_and_graphs_res| async move {
|
||||
let acs_and_graphs_enum: AcsAndGraphsOpt = match acs_and_graphs_res {
|
||||
Err(err) => AcsAndGraphsOpt::Error(err.to_string()),
|
||||
Ok(Err(err)) => AcsAndGraphsOpt::Error(err.to_string()),
|
||||
Ok(Ok(acs_and_graphs)) => AcsAndGraphsOpt::Some(acs_and_graphs),
|
||||
};
|
||||
|
||||
let result = adf_coll.update_one(doc! { "name": problem_name, "username": username }, match adf_problem_input.strategy {
|
||||
Strategy::Complete => doc! { "$set": { "acs_per_strategy.complete": &acs_and_graphs_enum } },
|
||||
Strategy::Ground => doc! { "$set": { "acs_per_strategy.ground": &acs_and_graphs_enum } },
|
||||
Strategy::Stable => doc! { "$set": { "acs_per_strategy.stable": &acs_and_graphs_enum } },
|
||||
Strategy::StableCountingA => doc! { "$set": { "acs_per_strategy.stable_counting_a": &acs_and_graphs_enum } },
|
||||
Strategy::StableCountingB => doc! { "$set": { "acs_per_strategy.stable_counting_b": &acs_and_graphs_enum } },
|
||||
Strategy::StableNogood => doc! { "$set": { "acs_per_strategy.stable_nogood": &acs_and_graphs_enum } },
|
||||
}, None).await;
|
||||
|
||||
if let Err(err) = result {
|
||||
log::error!("{err}");
|
||||
}
|
||||
}));
|
||||
|
||||
HttpResponse::Ok().body("Solving started...")
|
||||
}
|
||||
|
||||
#[get("/{problem_name}")]
|
||||
async fn get_adf_problem(
|
||||
app_state: web::Data<AppState>,
|
||||
identity: Option<Identity>,
|
||||
path: web::Path<String>,
|
||||
) -> impl Responder {
|
||||
let problem_name = path.into_inner();
|
||||
let adf_coll: mongodb::Collection<AdfProblem> = app_state
|
||||
.mongodb_client
|
||||
.database(DB_NAME)
|
||||
.collection(ADF_COLL);
|
||||
|
||||
let username = match identity.map(|id| id.id()) {
|
||||
None => {
|
||||
return HttpResponse::Unauthorized().body("You need to login to get an ADF problem.")
|
||||
}
|
||||
Some(Err(err)) => return HttpResponse::InternalServerError().body(err.to_string()),
|
||||
Some(Ok(username)) => username,
|
||||
};
|
||||
|
||||
let adf_problem = match adf_coll
|
||||
.find_one(doc! { "name": &problem_name, "username": &username }, None)
|
||||
.await
|
||||
{
|
||||
Err(err) => return HttpResponse::InternalServerError().body(err.to_string()),
|
||||
Ok(None) => {
|
||||
return HttpResponse::NotFound()
|
||||
.body(format!("ADF problem with name {problem_name} not found."))
|
||||
}
|
||||
Ok(Some(prob)) => prob,
|
||||
};
|
||||
|
||||
HttpResponse::Ok().json(AdfProblemInfo::from_adf_prob_and_tasks(
|
||||
adf_problem,
|
||||
&app_state.currently_running.lock().unwrap(),
|
||||
))
|
||||
}
|
||||
|
||||
#[delete("/{problem_name}")]
|
||||
async fn delete_adf_problem(
|
||||
app_state: web::Data<AppState>,
|
||||
identity: Option<Identity>,
|
||||
path: web::Path<String>,
|
||||
) -> impl Responder {
|
||||
let problem_name = path.into_inner();
|
||||
let adf_coll: mongodb::Collection<AdfProblem> = app_state
|
||||
.mongodb_client
|
||||
.database(DB_NAME)
|
||||
.collection(ADF_COLL);
|
||||
|
||||
let username = match identity.map(|id| id.id()) {
|
||||
None => {
|
||||
return HttpResponse::Unauthorized().body("You need to login to get an ADF problem.")
|
||||
}
|
||||
Some(Err(err)) => return HttpResponse::InternalServerError().body(err.to_string()),
|
||||
Some(Ok(username)) => username,
|
||||
};
|
||||
|
||||
match adf_coll
|
||||
.delete_one(doc! { "name": &problem_name, "username": &username }, None)
|
||||
.await
|
||||
{
|
||||
Ok(DeleteResult {
|
||||
deleted_count: 0, ..
|
||||
}) => HttpResponse::InternalServerError().body("Adf Problem could not be deleted."),
|
||||
Ok(DeleteResult {
|
||||
deleted_count: 1, ..
|
||||
}) => HttpResponse::Ok().body("Adf Problem deleted."),
|
||||
Ok(_) => {
|
||||
unreachable!("delete_one removes at most one entry so all cases are covered already")
|
||||
}
|
||||
Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
|
||||
}
|
||||
}
|
||||
|
||||
#[get("/")]
|
||||
async fn get_adf_problems_for_user(
|
||||
app_state: web::Data<AppState>,
|
||||
identity: Option<Identity>,
|
||||
) -> impl Responder {
|
||||
let adf_coll: mongodb::Collection<AdfProblem> = app_state
|
||||
.mongodb_client
|
||||
.database(DB_NAME)
|
||||
.collection(ADF_COLL);
|
||||
|
||||
let username = match identity.map(|id| id.id()) {
|
||||
None => {
|
||||
return HttpResponse::Unauthorized().body("You need to login to get an ADF problem.")
|
||||
}
|
||||
Some(Err(err)) => return HttpResponse::InternalServerError().body(err.to_string()),
|
||||
Some(Ok(username)) => username,
|
||||
};
|
||||
|
||||
let adf_problem_cursor = match adf_coll.find(doc! { "username": &username }, None).await {
|
||||
Err(err) => return HttpResponse::InternalServerError().body(err.to_string()),
|
||||
Ok(cursor) => cursor,
|
||||
};
|
||||
|
||||
let adf_problems: Vec<AdfProblemInfo> = match adf_problem_cursor
|
||||
.map_ok(|adf_problem| {
|
||||
AdfProblemInfo::from_adf_prob_and_tasks(
|
||||
adf_problem,
|
||||
&app_state.currently_running.lock().unwrap(),
|
||||
)
|
||||
})
|
||||
.try_collect()
|
||||
.await
|
||||
{
|
||||
Err(err) => return HttpResponse::InternalServerError().body(err.to_string()),
|
||||
Ok(probs) => probs,
|
||||
};
|
||||
|
||||
HttpResponse::Ok().json(adf_problems)
|
||||
}
|
||||
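The solve handler above returns "Solving started..." immediately and lets the computation continue in the background: the blocking work runs on Tokio's blocking pool under a deadline, and a detached task persists the outcome once it resolves. A minimal sketch of that pattern in isolation (the function and values here are illustrative only, not part of the PR; it assumes the `tokio` runtime with full features and `futures-util` for `.then`):

use std::time::Duration;

use futures_util::FutureExt; // provides `.then(...)` on futures
use tokio::task::{spawn, spawn_blocking};
use tokio::time::timeout;

async fn fire_and_forget() {
    // Run a CPU-heavy job on the blocking pool, bounded by a deadline.
    let job = timeout(
        Duration::from_secs(120),
        spawn_blocking(|| 40 + 2), // stand-in for the actual solving work
    );

    // Detach a task that records the outcome once the job finishes.
    spawn(job.then(|res| async move {
        match res {
            Ok(Ok(answer)) => println!("store result: {answer}"),
            Ok(Err(join_err)) => eprintln!("worker panicked: {join_err}"),
            Err(_elapsed) => eprintln!("computation timed out"),
        }
    }));

    // The caller (e.g. an HTTP handler) can respond right away here.
}

#[tokio::main]
async fn main() {
    fire_and_forget().await;
    // Give the detached task a moment to finish before the runtime shuts down.
    tokio::time::sleep(Duration::from_millis(50)).await;
}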
37
server/src/config.rs
Normal file
@ -0,0 +1,37 @@
use std::collections::HashSet;
use std::sync::Mutex;
use std::time::Duration;

use mongodb::Client;
use serde::Serialize;

use crate::adf::Strategy;

pub(crate) const COOKIE_DURATION: actix_web::cookie::time::Duration =
    actix_web::cookie::time::Duration::minutes(30);
pub(crate) const COMPUTE_TIME: Duration = Duration::from_secs(120);

pub(crate) const ASSET_DIRECTORY: &str = "./assets";

pub(crate) const DB_NAME: &str = "adf-obdd";
pub(crate) const USER_COLL: &str = "users";
pub(crate) const ADF_COLL: &str = "adf-problems";

#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash, Serialize)]
#[serde(tag = "type", content = "content")]
pub(crate) enum Task {
    Parse,
    Solve(Strategy),
}

#[derive(Clone, Debug, PartialEq, Eq, Hash)]
pub(crate) struct RunningInfo {
    pub(crate) username: String,
    pub(crate) adf_name: String,
    pub(crate) task: Task,
}

pub(crate) struct AppState {
    pub(crate) mongodb_client: Client,
    pub(crate) currently_running: Mutex<HashSet<RunningInfo>>,
}
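The `currently_running` set in `AppState` is what lets the handlers reject duplicate work: a `RunningInfo` value is inserted before a task starts and removed when it finishes, and the derived `Hash`/`Eq` make the membership test cheap. A minimal sketch of that guard pattern with a simplified `RunningInfo` (the helper is hypothetical, not part of the PR):

use std::collections::HashSet;
use std::sync::Mutex;

#[derive(Clone, Debug, PartialEq, Eq, Hash)]
struct RunningInfo {
    username: String,
    adf_name: String,
}

// Returns false if an identical task is already registered, true if it was added.
fn try_register(running: &Mutex<HashSet<RunningInfo>>, info: RunningInfo) -> bool {
    let mut guard = running.lock().unwrap();
    if guard.contains(&info) {
        return false;
    }
    guard.insert(info)
}

fn main() {
    let running = Mutex::new(HashSet::new());
    let info = RunningInfo {
        username: "alice".into(),
        adf_name: "example".into(),
    };
    assert!(try_register(&running, info.clone()));
    assert!(!try_register(&running, info.clone()));
    running.lock().unwrap().remove(&info); // task finished: free the slot again
    assert!(try_register(&running, info));
}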
118
server/src/double_labeled_graph.rs
Normal file
@ -0,0 +1,118 @@
use serde::{Deserialize, Serialize};
use std::collections::{HashMap, HashSet};

use adf_bdd::adf::Adf;
use adf_bdd::datatypes::{Term, Var};

#[derive(Clone, Deserialize, Serialize, Debug)]
/// This is a DTO for the graph output
pub struct DoubleLabeledGraph {
    // number of nodes equals the number of node labels
    // nodes implicitly have their index as their ID
    node_labels: HashMap<String, String>,
    // every node gets this label containing multiple entries (it might be empty)
    tree_root_labels: HashMap<String, Vec<String>>,
    lo_edges: Vec<(String, String)>,
    hi_edges: Vec<(String, String)>,
}

impl DoubleLabeledGraph {
    pub fn from_adf_and_ac(adf: &Adf, ac: Option<&Vec<Term>>) -> Self {
        let ac: &Vec<Term> = match ac {
            Some(ac) => ac,
            None => &adf.ac,
        };

        let mut node_indices: HashSet<usize> = HashSet::new();
        let mut new_node_indices: HashSet<usize> = ac.iter().map(|term| term.value()).collect();

        while !new_node_indices.is_empty() {
            node_indices = node_indices.union(&new_node_indices).copied().collect();
            new_node_indices = HashSet::new();

            for node_index in &node_indices {
                let lo_node_index = adf.bdd.nodes[*node_index].lo().value();
                if !node_indices.contains(&lo_node_index) {
                    new_node_indices.insert(lo_node_index);
                }

                let hi_node_index = adf.bdd.nodes[*node_index].hi().value();
                if !node_indices.contains(&hi_node_index) {
                    new_node_indices.insert(hi_node_index);
                }
            }
        }

        let node_labels: HashMap<String, String> = adf
            .bdd
            .nodes
            .iter()
            .enumerate()
            .filter(|(i, _)| node_indices.contains(i))
            .map(|(i, &node)| {
                let value_part = match node.var() {
                    Var::TOP => "TOP".to_string(),
                    Var::BOT => "BOT".to_string(),
                    _ => adf.ordering.name(node.var()).expect(
                        "name for each var should exist; special cases are handled separately",
                    ),
                };

                (i.to_string(), value_part)
            })
            .collect();

        let tree_root_labels_with_usize: HashMap<usize, Vec<String>> = ac.iter().enumerate().fold(
            adf.bdd
                .nodes
                .iter()
                .enumerate()
                .filter(|(i, _)| node_indices.contains(i))
                .map(|(i, _)| (i, vec![]))
                .collect(),
            |mut acc, (root_for, root_node)| {
                acc.get_mut(&root_node.value())
                    .expect("we know that the index will be in the map")
                    .push(adf.ordering.name(Var(root_for)).expect(
                        "name for each var should exist; special cases are handled separately",
                    ));

                acc
            },
        );

        let tree_root_labels: HashMap<String, Vec<String>> = tree_root_labels_with_usize
            .into_iter()
            .map(|(i, vec)| (i.to_string(), vec))
            .collect();

        let lo_edges: Vec<(String, String)> = adf
            .bdd
            .nodes
            .iter()
            .enumerate()
            .filter(|(i, _)| node_indices.contains(i))
            .filter(|(_, node)| !vec![Var::TOP, Var::BOT].contains(&node.var()))
            .map(|(i, &node)| (i, node.lo().value()))
            .map(|(i, v)| (i.to_string(), v.to_string()))
            .collect();

        let hi_edges: Vec<(String, String)> = adf
            .bdd
            .nodes
            .iter()
            .enumerate()
            .filter(|(i, _)| node_indices.contains(i))
            .filter(|(_, node)| !vec![Var::TOP, Var::BOT].contains(&node.var()))
            .map(|(i, &node)| (i, node.hi().value()))
            .map(|(i, v)| (i.to_string(), v.to_string()))
            .collect();

        DoubleLabeledGraph {
            node_labels,
            tree_root_labels,
            lo_edges,
            hi_edges,
        }
    }
}
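For reference, the DTO above serializes into plain string-keyed maps and edge lists, which is what the frontend's graph component consumes. A made-up example of the resulting JSON shape (values are illustrative only), written with `serde_json` for concreteness:

use serde_json::json;

// Illustrative only: node IDs are BDD node indices rendered as strings,
// `tree_root_labels` lists the statements whose acceptance condition is
// rooted at a node, and the lo/hi edges mirror the BDD structure.
fn example_graph_shape() -> serde_json::Value {
    json!({
        "node_labels": { "0": "BOT", "1": "TOP", "2": "a" },
        "tree_root_labels": { "0": [], "1": ["b"], "2": ["a"] },
        "lo_edges": [ ["2", "0"] ],
        "hi_edges": [ ["2", "1"] ]
    })
}

fn main() {
    println!("{}", example_graph_shape());
}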
116
server/src/main.rs
Normal file
@ -0,0 +1,116 @@
use std::collections::HashSet;
use std::sync::Mutex;

use actix_files as fs;
use actix_identity::IdentityMiddleware;
use actix_session::config::PersistentSession;
use actix_session::storage::CookieSessionStore;
use actix_session::SessionMiddleware;
use actix_web::cookie::Key;
use actix_web::dev::{fn_service, ServiceRequest, ServiceResponse};
use actix_web::{web, App, HttpServer};
use fs::NamedFile;
use mongodb::Client;

#[cfg(feature = "cors_for_local_development")]
use actix_cors::Cors;

mod adf;
mod config;
mod double_labeled_graph;
mod user;

use adf::{
    add_adf_problem, delete_adf_problem, get_adf_problem, get_adf_problems_for_user,
    solve_adf_problem,
};
use config::{AppState, ASSET_DIRECTORY, COOKIE_DURATION};
use user::{
    create_username_index, delete_account, login, logout, register, update_user, user_info,
};

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    env_logger::builder()
        .filter_level(log::LevelFilter::Debug)
        .init();

    // setup mongodb
    let mongodb_uri =
        std::env::var("MONGODB_URI").unwrap_or_else(|_| "mongodb://localhost:27017".into());
    let client = Client::with_uri_str(mongodb_uri)
        .await
        .expect("failed to connect to mongodb");
    create_username_index(&client).await;

    // cookie secret key
    let secret_key = Key::generate();

    // needs to be created outside of the HttpServer closure so that it is only created once!
    let app_data = web::Data::new(AppState {
        mongodb_client: client.clone(),
        currently_running: Mutex::new(HashSet::new()),
    });

    HttpServer::new(move || {
        let app = App::new();

        #[cfg(feature = "cors_for_local_development")]
        let cors = Cors::default()
            .allowed_origin("http://localhost:1234")
            .allow_any_method()
            .allow_any_header()
            .supports_credentials()
            .max_age(3600);

        #[cfg(feature = "cors_for_local_development")]
        let app = app.wrap(cors);

        #[cfg(feature = "cors_for_local_development")]
        let cookie_secure = false;
        #[cfg(not(feature = "cors_for_local_development"))]
        let cookie_secure = true;

        app.app_data(app_data.clone())
            .wrap(IdentityMiddleware::default())
            .wrap(
                SessionMiddleware::builder(CookieSessionStore::default(), secret_key.clone())
                    .cookie_name("adf-obdd-service-auth".to_owned())
                    .cookie_secure(cookie_secure)
                    .session_lifecycle(PersistentSession::default().session_ttl(COOKIE_DURATION))
                    .build(),
            )
            .service(
                web::scope("/users")
                    .service(register)
                    .service(delete_account)
                    .service(login)
                    .service(logout)
                    .service(user_info)
                    .service(update_user),
            )
            .service(
                web::scope("/adf")
                    .service(add_adf_problem)
                    .service(solve_adf_problem)
                    .service(get_adf_problem)
                    .service(delete_adf_problem)
                    .service(get_adf_problems_for_user),
            )
            // this must be last so it does not override anything
            .service(
                fs::Files::new("/", ASSET_DIRECTORY)
                    .index_file("index.html")
                    .default_handler(fn_service(|req: ServiceRequest| async {
                        let (req, _) = req.into_parts();
                        let file =
                            NamedFile::open_async(format!("{ASSET_DIRECTORY}/index.html")).await?;
                        let res = file.into_response(&req);
                        Ok(ServiceResponse::new(req, res))
                    })),
            )
    })
    .bind(("0.0.0.0", 8080))?
    .run()
    .await
}
365
server/src/user.rs
Normal file
@ -0,0 +1,365 @@
use actix_identity::Identity;
use actix_web::{delete, get, post, put, web, HttpMessage, HttpRequest, HttpResponse, Responder};
use argon2::password_hash::rand_core::OsRng;
use argon2::password_hash::SaltString;
use argon2::{Argon2, PasswordHash, PasswordHasher, PasswordVerifier};
use mongodb::results::{DeleteResult, UpdateResult};
use mongodb::{bson::doc, options::IndexOptions, Client, IndexModel};
use serde::{Deserialize, Serialize};

use crate::adf::AdfProblem;
use crate::config::{AppState, ADF_COLL, DB_NAME, USER_COLL};

#[derive(Deserialize, Serialize)]
pub(crate) struct User {
    pub(crate) username: String,
    pub(crate) password: Option<String>, // NOTE: Password being None indicates a temporary user
}

#[derive(Deserialize, Serialize)]
struct UserPayload {
    username: String,
    password: String,
}

#[derive(Deserialize, Serialize)]
struct UserInfo {
    username: String,
    temp: bool,
}

// Creates an index on the "username" field to force the values to be unique.
pub(crate) async fn create_username_index(client: &Client) {
    let options = IndexOptions::builder().unique(true).build();
    let model = IndexModel::builder()
        .keys(doc! { "username": 1 })
        .options(options)
        .build();
    client
        .database(DB_NAME)
        .collection::<User>(USER_COLL)
        .create_index(model, None)
        .await
        .expect("creating an index should succeed");
}

pub(crate) async fn username_exists(user_coll: &mongodb::Collection<User>, username: &str) -> bool {
    user_coll
        .find_one(doc! { "username": username }, None)
        .await
        .ok()
        .flatten()
        .is_some()
}

// Add new user
#[post("/register")]
async fn register(app_state: web::Data<AppState>, user: web::Json<UserPayload>) -> impl Responder {
    let mut user: UserPayload = user.into_inner();

    if user.username.is_empty() || user.password.is_empty() {
        return HttpResponse::BadRequest().body("Username and Password need to be set!");
    }

    let user_coll = app_state
        .mongodb_client
        .database(DB_NAME)
        .collection(USER_COLL);

    if username_exists(&user_coll, &user.username).await {
        return HttpResponse::Conflict()
            .body("Username is already taken. Please pick another one!");
    }

    let pw = &user.password;
    let salt = SaltString::generate(&mut OsRng);
    let hashed_pw = Argon2::default()
        .hash_password(pw.as_bytes(), &salt)
        .expect("Error while hashing password!")
        .to_string();

    user.password = hashed_pw;

    let result = user_coll
        .insert_one(
            User {
                username: user.username,
                password: Some(user.password),
            },
            None,
        )
        .await;
    match result {
        Ok(_) => HttpResponse::Ok().body("Registration successful!"),
        Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
    }
}

// Remove user
#[delete("/delete")]
async fn delete_account(
    app_state: web::Data<AppState>,
    identity: Option<Identity>,
) -> impl Responder {
    let user_coll: mongodb::Collection<User> = app_state
        .mongodb_client
        .database(DB_NAME)
        .collection(USER_COLL);
    let adf_coll: mongodb::Collection<AdfProblem> = app_state
        .mongodb_client
        .database(DB_NAME)
        .collection(ADF_COLL);

    match identity {
        None => HttpResponse::Unauthorized().body("You are not logged in."),
        Some(id) => match id.id() {
            Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
            Ok(username) => {
                // Delete all adfs created by user
                match adf_coll
                    .delete_many(doc! { "username": &username }, None)
                    .await
                {
                    Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
                    Ok(DeleteResult {
                        deleted_count: _, ..
                    }) => {
                        // Delete actual user
                        match user_coll
                            .delete_one(doc! { "username": &username }, None)
                            .await
                        {
                            Ok(DeleteResult {
                                deleted_count: 0, ..
                            }) => HttpResponse::InternalServerError()
                                .body("Account could not be deleted."),
                            Ok(DeleteResult {
                                deleted_count: 1, ..
                            }) => {
                                id.logout();
                                HttpResponse::Ok().body("Account deleted.")
                            }
                            Ok(_) => unreachable!(
                                "delete_one removes at most one entry so all cases are covered already"
                            ),
                            Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
                        }
                    }
                }
            }
        },
    }
}

// Login
#[post("/login")]
async fn login(
    req: HttpRequest,
    app_state: web::Data<AppState>,
    user_data: web::Json<UserPayload>,
) -> impl Responder {
    let username = &user_data.username;
    let pw = &user_data.password;

    if username.is_empty() || pw.is_empty() {
        return HttpResponse::BadRequest().body("Username and Password need to be set!");
    }

    let user_coll: mongodb::Collection<User> = app_state
        .mongodb_client
        .database(DB_NAME)
        .collection(USER_COLL);
    match user_coll
        .find_one(doc! { "username": username }, None)
        .await
    {
        Ok(Some(user)) => {
            let stored_password = match &user.password {
                None => return HttpResponse::BadRequest().body("Invalid username or password"), // NOTE: login as temporary user is not allowed
                Some(password) => password,
            };

            let stored_hash = PasswordHash::new(stored_password).unwrap();
            let pw_valid = Argon2::default()
                .verify_password(pw.as_bytes(), &stored_hash)
                .is_ok();

            if pw_valid {
                Identity::login(&req.extensions(), username.to_string()).unwrap();
                HttpResponse::Ok().body("Login successful!")
            } else {
                HttpResponse::BadRequest().body("Invalid username or password")
            }
        }
        Ok(None) => HttpResponse::NotFound().body(format!(
            "No user found with username {}",
            &user_data.username
        )),
        Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
    }
}

#[delete("/logout")]
async fn logout(app_state: web::Data<AppState>, id: Option<Identity>) -> impl Responder {
    let user_coll: mongodb::Collection<User> = app_state
        .mongodb_client
        .database(DB_NAME)
        .collection(USER_COLL);

    match id {
        None => HttpResponse::Unauthorized().body("You are not logged in."),
        Some(id) => match id.id() {
            Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
            Ok(username) => {
                let user: User = match user_coll
                    .find_one(doc! { "username": &username }, None)
                    .await
                {
                    Ok(Some(user)) => user,
                    Ok(None) => {
                        return HttpResponse::NotFound()
                            .body(format!("No user found with username {}", &username))
                    }
                    Err(err) => return HttpResponse::InternalServerError().body(err.to_string()),
                };

                if user.password.is_none() {
                    HttpResponse::BadRequest().body("You are logged in as a temporary user, so we won't log you out: you would not be able to log in again. If you want to be able to log in again, set a password. Otherwise your session will expire automatically at some point.")
                } else {
                    id.logout();
                    HttpResponse::Ok().body("Logout successful!")
                }
            }
        },
    }
}

// Get current user
#[get("/info")]
async fn user_info(app_state: web::Data<AppState>, identity: Option<Identity>) -> impl Responder {
    let user_coll: mongodb::Collection<User> = app_state
        .mongodb_client
        .database(DB_NAME)
        .collection(USER_COLL);

    match identity {
        None => {
            HttpResponse::Unauthorized().body("You need to login to get your account information.")
        }
        Some(id) => match id.id() {
            Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
            Ok(username) => {
                match user_coll
                    .find_one(doc! { "username": &username }, None)
                    .await
                {
                    Ok(Some(user)) => {
                        let info = UserInfo {
                            username: user.username,
                            temp: user.password.is_none(),
                        };

                        HttpResponse::Ok().json(info)
                    }
                    Ok(None) => {
                        id.logout();
                        HttpResponse::NotFound().body("Logged in user does not exist anymore.")
                    }
                    Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
                }
            }
        },
    }
}

// Update current user
#[put("/update")]
async fn update_user(
    req: HttpRequest,
    app_state: web::Data<AppState>,
    identity: Option<Identity>,
    user: web::Json<UserPayload>,
) -> impl Responder {
    let mut user: UserPayload = user.into_inner();

    if user.username.is_empty() || user.password.is_empty() {
        return HttpResponse::BadRequest().body("Username and Password need to be set!");
    }

    let user_coll = app_state
        .mongodb_client
        .database(DB_NAME)
        .collection(USER_COLL);
    let adf_coll: mongodb::Collection<AdfProblem> = app_state
        .mongodb_client
        .database(DB_NAME)
        .collection(ADF_COLL);

    match identity {
        None => {
            HttpResponse::Unauthorized().body("You need to login to update your account information.")
        }
        Some(id) => match id.id() {
            Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
            Ok(username) => {
                if user.username != username && username_exists(&user_coll, &user.username).await {
                    return HttpResponse::Conflict()
                        .body("Username is already taken. Please pick another one!");
                }

                let pw = &user.password;
                let salt = SaltString::generate(&mut OsRng);
                let hashed_pw = Argon2::default()
                    .hash_password(pw.as_bytes(), &salt)
                    .expect("Error while hashing password!")
                    .to_string();

                user.password = hashed_pw;

                let result = user_coll
                    .replace_one(
                        doc! { "username": &username },
                        User {
                            username: user.username.clone(),
                            password: Some(user.password),
                        },
                        None,
                    )
                    .await;
                match result {
                    Ok(UpdateResult {
                        modified_count: 0, ..
                    }) => HttpResponse::InternalServerError().body("Account could not be updated."),
                    Ok(UpdateResult {
                        modified_count: 1, ..
                    }) => {
                        // re-login with new username
                        Identity::login(&req.extensions(), user.username.clone()).unwrap();

                        // update all adf problems of user
                        match adf_coll
                            .update_many(
                                doc! { "username": &username },
                                doc! { "$set": { "username": &user.username } },
                                None,
                            )
                            .await
                        {
                            Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
                            Ok(UpdateResult {
                                modified_count: _, ..
                            }) => HttpResponse::Ok().json(UserInfo {
                                username: user.username,
                                temp: false,
                            }),
                        }
                    }
                    Ok(_) => unreachable!(
                        "replace_one replaces at most one entry so all cases are covered already"
                    ),
                    Err(err) => HttpResponse::InternalServerError().body(err.to_string()),
                }
            }
        },
    }
}
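The register/login pair above relies on the usual Argon2 PHC-string round trip: hash with a fresh salt on registration, store the resulting string, then parse and verify on login. A self-contained sketch of just that round trip, using the same `argon2` crate API as the handlers above (the password values are made up):

use argon2::password_hash::{rand_core::OsRng, PasswordHash, PasswordHasher, PasswordVerifier, SaltString};
use argon2::Argon2;

fn main() {
    let password = b"correct horse battery staple";

    // Registration: hash with a freshly generated salt; store the PHC string.
    let salt = SaltString::generate(&mut OsRng);
    let phc_string = Argon2::default()
        .hash_password(password, &salt)
        .expect("hashing should succeed")
        .to_string();

    // Login: parse the stored PHC string and verify the submitted password.
    let parsed = PasswordHash::new(&phc_string).expect("stored hash should be valid");
    assert!(Argon2::default().verify_password(password, &parsed).is_ok());
    assert!(Argon2::default().verify_password(b"wrong", &parsed).is_err());
}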