Merge branch 'develop' of github.com:Budibase/budibase into develop

Martin McKeaveney 2023-09-06 18:30:16 +01:00
commit 4668315acd
63 changed files with 502 additions and 545 deletions

View File

@ -115,77 +115,4 @@ This job is responsible for deploying to our production, cloud kubernetes enviro
### Rollback A Bad Cloud Deployment ### Rollback A Bad Cloud Deployment
- Kick off cloud deploy job - Kick off cloud deploy job
- Ensure you are running off master - Ensure you are running off master
- Enter the version number of the last known good version of budibase. For example `1.0.0` - Enter the version number of the last known good version of budibase. For example `1.0.0`
## Pro
> **NOTE**: When developing across both the pro and budibase repositories, your branch names need to match, or else the correct pro branch will not be used in your CI job.
### Installing Pro
The pro package is always installed from source in our CI jobs.
This is done to prevent pro needing to be published prior to CI runs in budibase. This is required for two reasons:
- To reduce the need for developers to manually bump versions, i.e.:
- release pro, bump the pro dependency in budibase, and only then can CI run successfully
- The cyclic dependency on backend-core, i.e.:
- pro depends on backend-core
- server depends on pro
- backend-core lives in the monorepo, so it can't be released independently to be used in pro
- therefore the only option is to pull pro from source and release it as a part of the monorepo release, as if it were a package within the monorepo
The install is performed using the same steps as local development, via the `yarn bootstrap` command; see the [Contributing Guide#Pro](../../docs/CONTRIBUTING.md#pro).
The branch to install pro from can vary depending on the ref of the commit that triggered the budibase CI job. This is done to enable branches which have changes in both the monorepo and the pro repo to have their CI pass successfully.
This is done using the [pro/install.sh](../../scripts/pro/install.sh) script (a simplified sketch follows this list). The script will:
- Clone pro to its default branch (`develop`)
- Check if the clone worked, on forked versions of budibase this will fail due to no access
- This is fine as the `yarn` command will install the version from NPM
- Community PRs should never touch pro so this will always work
- Check out the `BRANCH` argument; if this fails, fall back to `BASE_BRANCH`
- This enables the more complex case of a feature branch being merged to another feature branch, e.g.
- I am working on a branch `epic/stonks` which exists on budibase and pro.
- I want to merge a change to this branch in budibase from `feature/stonks-ui`, which only exists in budibase
- The base branch ensures that `epic/stonks` in pro will still be checked out for the CI run, rather than falling back to `develop`
- Run `yarn setup` to build and install dependencies
- `yarn`
- `yarn bootstrap`
- `yarn build`
- This will build the .ts files, and also update the `main` and `types` fields of `package.json` to point to `dist` rather than `src`
- The build command will only ever work in CI; it is prevented in local dev
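Below is a minimal sketch of the fallback behaviour described in this list. It is illustrative only; the real logic lives in `scripts/pro/install.sh`, and the repository URL and argument handling shown here are assumptions.
```
#!/bin/bash
# Illustrative sketch of the checkout fallback; not the actual install.sh.
BRANCH=$1       # branch that triggered the budibase CI job
BASE_BRANCH=$2  # base branch of the pull request, if any

# Clone pro on its default branch. Forks without access will fail here,
# in which case the NPM version installed by `yarn` is used instead.
git clone git@github.com:Budibase/budibase-pro.git || exit 0

cd budibase-pro
# Prefer the branch matching the budibase branch, then the base branch,
# otherwise stay on the default branch (develop).
git checkout "$BRANCH" || git checkout "$BASE_BRANCH" || true

# Build and install from source, mirroring local development.
yarn && yarn bootstrap && yarn build
```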
#### `BRANCH` and `BASE_BRANCH` arguments
These arguments are supplied by the various budibase build and release pipelines
- `budibase_ci`
- `BRANCH: ${{ github.event.pull_request.head.ref }}` -> The branch being merged
- `BASE_BRANCH: ${{ github.event.pull_request.base.ref }}` -> The base branch
- `release-develop`
- `BRANCH: develop` -> always use the `develop` branch in pro
- `release`
- `BRANCH: master` -> always use the `master` branch in pro
### Releasing Pro
After budibase dependencies have been released we will release the new version of pro to match the release version of budibase dependencies. This is to ensure that we are always keeping the version of `backend-core` in sync in the pro package and in budibase packages. Without this we could run into scenarios where different versions are being used when installed via `yarn` inside the docker images, creating cases that are very difficult to debug.
Pro is released using the [pro/release.sh](../../scripts/pro/release.sh) script (a simplified sketch follows this list). The script will:
- Inspect the `VERSION` from the `lerna.json` file in budibase
- Determine whether to use the `latest` or `develop` tag based on the command argument
- Go to pro directory
- install npm creds
- update the version of `backend-core` to be `VERSION`, the version just released by lerna
- publish to npm using a `lerna publish` command, as pro is itself a monorepo.
- force the version to be the same as `VERSION` to keep pro and budibase in sync
- revert the changes to `main` and `types` in `package.json` that were made by the build step, to point back to source
- commit & push: `Prep next development iteration`
- Go to budibase
- Update to the new version of pro in `server` and `worker` so the latest pro version is used in the docker builds
- commit & push: `Update pro version to $VERSION`
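A simplified sketch of that flow is shown below. It is illustrative only; the real steps live in `scripts/pro/release.sh`, and the paths and exact lerna invocation here are assumptions.
```
#!/bin/bash
# Illustrative sketch of the pro release flow; not the actual release.sh.
COMMAND=$1                              # "develop" or empty
TAG=${COMMAND:-latest}                  # npm dist-tag to publish under
VERSION=$(jq -r .version lerna.json)    # version just released by lerna in budibase

cd ../budibase-pro
# Keep backend-core in lockstep with the budibase release.
yarn lerna exec -- yarn add @budibase/backend-core@$VERSION
# Publish pro as VERSION so pro and budibase stay in sync.
yarn lerna publish $VERSION --dist-tag $TAG --yes --force-publish
# (The real script also reverts the `main`/`types` changes made by the build.)
git commit -am "Prep next development iteration" && git push

cd ../budibase
# Point server and worker at the new pro version for the docker builds.
yarn workspace @budibase/server add @budibase/pro@$VERSION
yarn workspace @budibase/worker add @budibase/pro@$VERSION
git commit -am "Update pro version to $VERSION" && git push
```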
#### `COMMAND` argument
This argument is supplied by the existing `release` and `release:develop` budibase commands, which invoke the pro release:
- `release` will supply no command and default to `latest`
- `release:develop` will supply `develop`

View File

@ -67,7 +67,6 @@ jobs:
- name: Bootstrap and build (CLI) - name: Bootstrap and build (CLI)
run: | run: |
yarn yarn
yarn bootstrap
yarn build yarn build
- name: Build OpenAPI spec - name: Build OpenAPI spec

View File

@ -1,4 +1,5 @@
{{- if .Values.globals.createSecrets -}} {{- $existingSecret := lookup "v1" "Secret" .Release.Namespace (include "budibase.fullname" .) }}
{{- if .Values.globals.createSecrets }}
apiVersion: v1 apiVersion: v1
kind: Secret kind: Secret
metadata: metadata:
@ -10,8 +11,15 @@ metadata:
heritage: "{{ .Release.Service }}" heritage: "{{ .Release.Service }}"
type: Opaque type: Opaque
data: data:
{{- if $existingSecret }}
internalApiKey: {{ index $existingSecret.data "internalApiKey" }}
jwtSecret: {{ index $existingSecret.data "jwtSecret" }}
objectStoreAccess: {{ index $existingSecret.data "objectStoreAccess" }}
objectStoreSecret: {{ index $existingSecret.data "objectStoreSecret" }}
{{- else }}
internalApiKey: {{ template "budibase.defaultsecret" .Values.globals.internalApiKey }} internalApiKey: {{ template "budibase.defaultsecret" .Values.globals.internalApiKey }}
jwtSecret: {{ template "budibase.defaultsecret" .Values.globals.jwtSecret }} jwtSecret: {{ template "budibase.defaultsecret" .Values.globals.jwtSecret }}
objectStoreAccess: {{ template "budibase.defaultsecret" .Values.services.objectStore.accessKey }} objectStoreAccess: {{ template "budibase.defaultsecret" .Values.services.objectStore.accessKey }}
objectStoreSecret: {{ template "budibase.defaultsecret" .Values.services.objectStore.secretKey }} objectStoreSecret: {{ template "budibase.defaultsecret" .Values.services.objectStore.secretKey }}
{{- end -}} {{- end }}
{{- end }}

View File

@ -55,7 +55,7 @@ yarn setup
The yarn setup command runs several build steps i.e. The yarn setup command runs several build steps i.e.
``` ```
node ./hosting/scripts/setup.js && yarn && yarn bootstrap && yarn build && yarn dev node ./hosting/scripts/setup.js && yarn && yarn build && yarn dev
``` ```
So this command will actually run the application in dev mode. It creates .env files under `./packages/server` and `./packages/worker` and runs docker containers for each service via docker-compose. So this command will actually run the application in dev mode. It creates .env files under `./packages/server` and `./packages/worker` and runs docker containers for each service via docker-compose.

View File

@ -55,7 +55,7 @@ yarn setup
The yarn setup command runs several build steps i.e. The yarn setup command runs several build steps i.e.
``` ```
node ./hosting/scripts/setup.js && yarn && yarn bootstrap && yarn build && yarn dev node ./hosting/scripts/setup.js && yarn && yarn build && yarn dev
``` ```
So this command will actually run the application in dev mode. It creates .env files under `./packages/server` and `./packages/worker` and runs docker containers for each service via docker-compose. So this command will actually run the application in dev mode. It creates .env files under `./packages/server` and `./packages/worker` and runs docker containers for each service via docker-compose.

View File

@ -74,7 +74,7 @@ yarn setup
The yarn setup command runs several build steps i.e. The yarn setup command runs several build steps i.e.
``` ```
node ./hosting/scripts/setup.js && yarn && yarn bootstrap && yarn build && yarn dev node ./hosting/scripts/setup.js && yarn && yarn build && yarn dev
``` ```
So this command will actually run the application in dev mode. It creates .env files under `./packages/server` and `./packages/worker` and runs docker containers for each service via docker-compose. So this command will actually run the application in dev mode. It creates .env files under `./packages/server` and `./packages/worker` and runs docker containers for each service via docker-compose.

View File

@ -58,7 +58,6 @@ Node setup:
``` ```
node ./hosting/scripts/setup.js node ./hosting/scripts/setup.js
yarn yarn
yarn bootstrap
yarn build yarn build
``` ```
#### Build Image #### Build Image

View File

@ -47,7 +47,6 @@ Node setup:
``` ```
node ./hosting/scripts/setup.js node ./hosting/scripts/setup.js
yarn yarn
yarn bootstrap
yarn build yarn build
``` ```
#### Build Image #### Build Image

View File

@ -1,5 +1,5 @@
{ {
"version": "2.9.33-alpha.15", "version": "2.9.39-alpha.10",
"npmClient": "yarn", "npmClient": "yarn",
"packages": [ "packages": [
"packages/*" "packages/*"

View File

@ -33,21 +33,18 @@
"scripts": { "scripts": {
"preinstall": "node scripts/syncProPackage.js", "preinstall": "node scripts/syncProPackage.js",
"setup": "git config submodule.recurse true && git submodule update && node ./hosting/scripts/setup.js && yarn && yarn build && yarn dev", "setup": "git config submodule.recurse true && git submodule update && node ./hosting/scripts/setup.js && yarn && yarn build && yarn dev",
"bootstrap": "./scripts/link-dependencies.sh && echo '***BOOTSTRAP ONLY REQUIRED FOR USE WITH ACCOUNT PORTAL***'",
"build": "lerna run build --stream", "build": "lerna run build --stream",
"build:dev": "lerna run --stream prebuild && yarn nx run-many --target=build --output-style=dynamic --watch --preserveWatchOutput", "build:dev": "lerna run --stream prebuild && yarn nx run-many --target=build --output-style=dynamic --watch --preserveWatchOutput",
"check:types": "lerna run check:types", "check:types": "lerna run check:types",
"backend:bootstrap": "./scripts/scopeBackend.sh && yarn run bootstrap",
"backend:build": "./scripts/scopeBackend.sh 'lerna run --stream build'",
"build:sdk": "lerna run --stream build:sdk", "build:sdk": "lerna run --stream build:sdk",
"deps:circular": "madge packages/server/dist/index.js packages/worker/src/index.ts packages/backend-core/dist/src/index.js packages/cli/src/index.js --circular", "deps:circular": "madge packages/server/dist/index.js packages/worker/src/index.ts packages/backend-core/dist/src/index.js packages/cli/src/index.js --circular",
"release": "lerna publish from-package --yes --force-publish --no-git-tag-version --no-push --no-git-reset", "release": "lerna publish from-package --yes --force-publish --no-git-tag-version --no-push --no-git-reset",
"release:develop": "yarn release --dist-tag develop", "release:develop": "yarn release --dist-tag develop",
"restore": "yarn run clean && yarn run bootstrap && yarn run build", "restore": "yarn run clean && yarn && yarn run build",
"nuke": "yarn run nuke:packages && yarn run nuke:docker", "nuke": "yarn run nuke:packages && yarn run nuke:docker",
"nuke:packages": "yarn run restore", "nuke:packages": "yarn run restore",
"nuke:docker": "lerna run --stream dev:stack:nuke", "nuke:docker": "lerna run --stream dev:stack:nuke",
"clean": "lerna clean", "clean": "lerna clean -y",
"kill-builder": "kill-port 3000", "kill-builder": "kill-port 3000",
"kill-server": "kill-port 4001 4002", "kill-server": "kill-port 4001 4002",
"kill-all": "yarn run kill-builder && yarn run kill-server", "kill-all": "yarn run kill-builder && yarn run kill-server",
@ -93,9 +90,8 @@
"mode:account": "yarn mode:cloud && yarn env:account:enable", "mode:account": "yarn mode:cloud && yarn env:account:enable",
"security:audit": "node scripts/audit.js", "security:audit": "node scripts/audit.js",
"postinstall": "husky install", "postinstall": "husky install",
"dep:clean": "yarn clean -y && yarn bootstrap", "submodules:load": "git submodule init && git submodule update && yarn",
"submodules:load": "git submodule init && git submodule update && yarn && yarn bootstrap", "submodules:unload": "git submodule deinit --all && yarn"
"submodules:unload": "git submodule deinit --all && yarn && yarn bootstrap"
}, },
"workspaces": { "workspaces": {
"packages": [ "packages": [

View File

@ -1,4 +1,6 @@
* *
!dist/**/* !dist/**/*
dist/tsconfig.build.tsbuildinfo dist/tsconfig.build.tsbuildinfo
!package.json !package.json
!src/**
!tests/**

View File

@ -6,7 +6,7 @@
"types": "dist/src/index.d.ts", "types": "dist/src/index.d.ts",
"exports": { "exports": {
".": "./dist/index.js", ".": "./dist/index.js",
"./tests": "./dist/tests.js", "./tests": "./dist/tests/index.js",
"./*": "./dist/*.js" "./*": "./dist/*.js"
}, },
"author": "Budibase", "author": "Budibase",
@ -14,7 +14,7 @@
"scripts": { "scripts": {
"prebuild": "rimraf dist/", "prebuild": "rimraf dist/",
"prepack": "cp package.json dist", "prepack": "cp package.json dist",
"build": "node ./scripts/build.js && tsc -p tsconfig.build.json --emitDeclarationOnly --paths null", "build": "tsc -p tsconfig.build.json --paths null && node ./scripts/build.js",
"build:dev": "yarn prebuild && tsc --build --watch --preserveWatchOutput", "build:dev": "yarn prebuild && tsc --build --watch --preserveWatchOutput",
"check:types": "tsc -p tsconfig.json --noEmit --paths null", "check:types": "tsc -p tsconfig.json --noEmit --paths null",
"test": "bash scripts/test.sh", "test": "bash scripts/test.sh",

View File

@ -1,6 +1,4 @@
#!/usr/bin/node #!/usr/bin/node
const coreBuild = require("../../../scripts/build") const coreBuild = require("../../../scripts/build")
coreBuild("./src/plugin/index.ts", "./dist/plugins.js")
coreBuild("./src/index.ts", "./dist/index.js") coreBuild("./src/index.ts", "./dist/index.js")
coreBuild("./tests/index.ts", "./dist/tests.js")

View File

@ -8,7 +8,6 @@ import {
DatabasePutOpts, DatabasePutOpts,
DatabaseCreateIndexOpts, DatabaseCreateIndexOpts,
DatabaseDeleteIndexOpts, DatabaseDeleteIndexOpts,
DocExistsResponse,
Document, Document,
isDocument, isDocument,
} from "@budibase/types" } from "@budibase/types"
@ -121,19 +120,6 @@ export class DatabaseImpl implements Database {
return this.updateOutput(() => db.get(id)) return this.updateOutput(() => db.get(id))
} }
async docExists(docId: string): Promise<DocExistsResponse> {
const db = await this.checkSetup()
let _rev, exists
try {
const { etag } = await db.head(docId)
_rev = etag
exists = true
} catch (err) {
exists = false
}
return { _rev, exists }
}
async remove(idOrDoc: string | Document, rev?: string) { async remove(idOrDoc: string | Document, rev?: string) {
const db = await this.checkSetup() const db = await this.checkSetup()
let _id: string let _id: string

View File

@ -380,8 +380,8 @@ export function getDBRoleID(roleName: string) {
export function getExternalRoleID(roleId: string, version?: string) { export function getExternalRoleID(roleId: string, version?: string) {
// for built-in roles we want to remove the DB role ID element (role_) // for built-in roles we want to remove the DB role ID element (role_)
if ( if (
(roleId.startsWith(DocumentType.ROLE) && isBuiltin(roleId)) || roleId.startsWith(DocumentType.ROLE) &&
version === RoleIDVersion.NAME (isBuiltin(roleId) || version === RoleIDVersion.NAME)
) { ) {
return roleId.split(`${DocumentType.ROLE}${SEPARATOR}`)[1] return roleId.split(`${DocumentType.ROLE}${SEPARATOR}`)[1]
} }

View File

@ -18,6 +18,7 @@ export default function positionDropdown(element, opts) {
useAnchorWidth, useAnchorWidth,
offset = 5, offset = 5,
customUpdate, customUpdate,
offsetBelow,
} = opts } = opts
if (!anchor) { if (!anchor) {
return return
@ -47,7 +48,7 @@ export default function positionDropdown(element, opts) {
styles.top = anchorBounds.top - elementBounds.height - offset styles.top = anchorBounds.top - elementBounds.height - offset
styles.maxHeight = maxHeight || 240 styles.maxHeight = maxHeight || 240
} else { } else {
styles.top = anchorBounds.bottom + offset styles.top = anchorBounds.bottom + (offsetBelow || offset)
styles.maxHeight = styles.maxHeight =
maxHeight || window.innerHeight - anchorBounds.bottom - 20 maxHeight || window.innerHeight - anchorBounds.bottom - 20
} }

View File

@ -17,6 +17,9 @@
export let fetchTerm = null export let fetchTerm = null
export let useFetch = false export let useFetch = false
export let customPopoverHeight export let customPopoverHeight
export let customPopoverOffsetBelow
export let customPopoverMaxHeight
export let open = false
const dispatch = createEventDispatcher() const dispatch = createEventDispatcher()
@ -88,6 +91,7 @@
isPlaceholder={!arrayValue.length} isPlaceholder={!arrayValue.length}
{autocomplete} {autocomplete}
bind:fetchTerm bind:fetchTerm
bind:open
{useFetch} {useFetch}
{isOptionSelected} {isOptionSelected}
{getOptionLabel} {getOptionLabel}
@ -96,4 +100,6 @@
{sort} {sort}
{autoWidth} {autoWidth}
{customPopoverHeight} {customPopoverHeight}
{customPopoverOffsetBelow}
{customPopoverMaxHeight}
/> />

View File

@ -38,6 +38,8 @@
export let fetchTerm = null export let fetchTerm = null
export let useFetch = false export let useFetch = false
export let customPopoverHeight export let customPopoverHeight
export let customPopoverOffsetBelow
export let customPopoverMaxHeight
export let align = "left" export let align = "left"
export let footer = null export let footer = null
export let customAnchor = null export let customAnchor = null
@ -102,7 +104,7 @@
bind:this={button} bind:this={button}
> >
{#if fieldIcon} {#if fieldIcon}
{#if !useOptionIconImage}x {#if !useOptionIconImage}
<span class="option-extra icon"> <span class="option-extra icon">
<Icon size="S" name={fieldIcon} /> <Icon size="S" name={fieldIcon} />
</span> </span>
@ -150,7 +152,9 @@
on:close={() => (open = false)} on:close={() => (open = false)}
useAnchorWidth={!autoWidth} useAnchorWidth={!autoWidth}
maxWidth={autoWidth ? 400 : null} maxWidth={autoWidth ? 400 : null}
maxHeight={customPopoverMaxHeight}
customHeight={customPopoverHeight} customHeight={customPopoverHeight}
offsetBelow={customPopoverOffsetBelow}
> >
<div <div
class="popover-content" class="popover-content"

View File

@ -21,10 +21,12 @@
export let sort = false export let sort = false
export let align export let align
export let footer = null export let footer = null
export let open = false
export let tag = null export let tag = null
const dispatch = createEventDispatcher() export let customPopoverOffsetBelow
export let customPopoverMaxHeight
let open = false const dispatch = createEventDispatcher()
$: fieldText = getFieldText(value, options, placeholder) $: fieldText = getFieldText(value, options, placeholder)
$: fieldIcon = getFieldAttribute(getOptionIcon, value, options) $: fieldIcon = getFieldAttribute(getOptionIcon, value, options)
@ -84,6 +86,8 @@
{autocomplete} {autocomplete}
{sort} {sort}
{tag} {tag}
{customPopoverOffsetBelow}
{customPopoverMaxHeight}
isPlaceholder={value == null || value === ""} isPlaceholder={value == null || value === ""}
placeholderOption={placeholder === false ? null : placeholder} placeholderOption={placeholder === false ? null : placeholder}
isOptionSelected={option => option === value} isOptionSelected={option => option === value}

View File

@ -19,6 +19,7 @@
export let useAnchorWidth = false export let useAnchorWidth = false
export let dismissible = true export let dismissible = true
export let offset = 5 export let offset = 5
export let offsetBelow
export let customHeight export let customHeight
export let animate = true export let animate = true
export let customZindex export let customZindex
@ -89,6 +90,7 @@
maxWidth, maxWidth,
useAnchorWidth, useAnchorWidth,
offset, offset,
offsetBelow,
customUpdate: handlePostionUpdate, customUpdate: handlePostionUpdate,
}} }}
use:clickOutside={{ use:clickOutside={{

View File

@ -955,7 +955,9 @@ export const buildFormSchema = (component, asset) => {
const patched = convertOldFieldFormat(component.fields || []) const patched = convertOldFieldFormat(component.fields || [])
patched?.forEach(({ field, active }) => { patched?.forEach(({ field, active }) => {
if (!active) return if (!active) return
schema[field] = { type: info?.schema[field].type } if (info?.schema[field]) {
schema[field] = { type: info?.schema[field].type }
}
}) })
} }

View File

@ -627,6 +627,7 @@ export const getFrontendStore = () => {
component[setting.key] = { component[setting.key] = {
label: defaultDS.name, label: defaultDS.name,
tableId: defaultDS._id, tableId: defaultDS._id,
resourceId: defaultDS._id,
type: "table", type: "table",
} }
} else if (setting.type === "dataProvider") { } else if (setting.type === "dataProvider") {
@ -1245,6 +1246,13 @@ export const getFrontendStore = () => {
const settings = getComponentSettings(component._component) const settings = getComponentSettings(component._component)
const updatedSetting = settings.find(setting => setting.key === name) const updatedSetting = settings.find(setting => setting.key === name)
const resetFields = settings.filter(
setting => name === setting.resetOn
)
resetFields?.forEach(setting => {
component[setting.key] = null
})
if ( if (
updatedSetting?.type === "dataSource" || updatedSetting?.type === "dataSource" ||
updatedSetting?.type === "table" updatedSetting?.type === "table"

View File

@ -8,7 +8,7 @@ export default function (datasources) {
} }
return datasources.map(datasource => { return datasources.map(datasource => {
return { return {
name: `${datasource.name} - List`, name: `${datasource.label} - List`,
create: () => createScreen(datasource), create: () => createScreen(datasource),
id: ROW_LIST_TEMPLATE, id: ROW_LIST_TEMPLATE,
resourceId: datasource.resourceId, resourceId: datasource.resourceId,
@ -17,13 +17,13 @@ export default function (datasources) {
} }
export const ROW_LIST_TEMPLATE = "ROW_LIST_TEMPLATE" export const ROW_LIST_TEMPLATE = "ROW_LIST_TEMPLATE"
export const rowListUrl = datasource => sanitizeUrl(`/${datasource.name}`) export const rowListUrl = datasource => sanitizeUrl(`/${datasource.label}`)
const generateTableBlock = datasource => { const generateTableBlock = datasource => {
const tableBlock = new Component("@budibase/standard-components/tableblock") const tableBlock = new Component("@budibase/standard-components/tableblock")
tableBlock tableBlock
.customProps({ .customProps({
title: datasource.name, title: datasource.label,
dataSource: datasource, dataSource: datasource,
sortOrder: "Ascending", sortOrder: "Ascending",
size: "spectrum--medium", size: "spectrum--medium",
@ -34,14 +34,14 @@ const generateTableBlock = datasource => {
titleButtonText: "Create row", titleButtonText: "Create row",
titleButtonClickBehaviour: "new", titleButtonClickBehaviour: "new",
}) })
.instanceName(`${datasource.name} - Table block`) .instanceName(`${datasource.label} - Table block`)
return tableBlock return tableBlock
} }
const createScreen = datasource => { const createScreen = datasource => {
return new Screen() return new Screen()
.route(rowListUrl(datasource)) .route(rowListUrl(datasource))
.instanceName(`${datasource.name} - List`) .instanceName(`${datasource.label} - List`)
.addChild(generateTableBlock(datasource)) .addChild(generateTableBlock(datasource))
.json() .json()
} }

View File

@ -10,6 +10,7 @@
import ManageAccessButton from "./buttons/ManageAccessButton.svelte" import ManageAccessButton from "./buttons/ManageAccessButton.svelte"
import HideAutocolumnButton from "./buttons/HideAutocolumnButton.svelte" import HideAutocolumnButton from "./buttons/HideAutocolumnButton.svelte"
import { notifications } from "@budibase/bbui" import { notifications } from "@budibase/bbui"
import { ROW_EXPORT_FORMATS } from "constants/backend"
export let view = {} export let view = {}
@ -19,6 +20,14 @@
let type = "internal" let type = "internal"
$: name = view.name $: name = view.name
$: calculation = view.calculation
$: supportedFormats = Object.values(ROW_EXPORT_FORMATS).filter(key => {
if (calculation && key === ROW_EXPORT_FORMATS.JSON_WITH_SCHEMA) {
return false
}
return true
})
// Fetch rows for specified view // Fetch rows for specified view
$: fetchViewData(name, view.field, view.groupBy, view.calculation) $: fetchViewData(name, view.field, view.groupBy, view.calculation)
@ -68,5 +77,5 @@
{/if} {/if}
<ManageAccessButton resourceId={decodeURI(name)} /> <ManageAccessButton resourceId={decodeURI(name)} />
<HideAutocolumnButton bind:hideAutocolumns /> <HideAutocolumnButton bind:hideAutocolumns />
<ExportButton view={view.name} /> <ExportButton view={view.name} formats={supportedFormats} />
</Table> </Table>

View File

@ -7,6 +7,7 @@
export let sorting export let sorting
export let disabled = false export let disabled = false
export let selectedRows export let selectedRows
export let formats
let modal let modal
</script> </script>
@ -15,5 +16,5 @@
Export Export
</ActionButton> </ActionButton>
<Modal bind:this={modal}> <Modal bind:this={modal}>
<ExportModal {view} {filters} {sorting} {selectedRows} /> <ExportModal {view} {filters} {sorting} {selectedRows} {formats} />
</Modal> </Modal>

View File

@ -9,30 +9,43 @@
import download from "downloadjs" import download from "downloadjs"
import { API } from "api" import { API } from "api"
import { Constants, LuceneUtils } from "@budibase/frontend-core" import { Constants, LuceneUtils } from "@budibase/frontend-core"
import { ROW_EXPORT_FORMATS } from "constants/backend"
const FORMATS = [
{
name: "CSV",
key: "csv",
},
{
name: "JSON",
key: "json",
},
{
name: "JSON with Schema",
key: "jsonWithSchema",
},
]
export let view export let view
export let filters export let filters
export let sorting export let sorting
export let selectedRows = [] export let selectedRows = []
export let formats
let exportFormat = FORMATS[0].key const FORMATS = [
{
name: "CSV",
key: ROW_EXPORT_FORMATS.CSV,
},
{
name: "JSON",
key: ROW_EXPORT_FORMATS.JSON,
},
{
name: "JSON with Schema",
key: ROW_EXPORT_FORMATS.JSON_WITH_SCHEMA,
},
]
$: options = FORMATS.filter(format => {
if (formats && !formats.includes(format.key)) {
return false
}
return true
})
let exportFormat
let filterLookup let filterLookup
$: if (options && !exportFormat) {
exportFormat = Array.isArray(options) ? options[0]?.key : []
}
$: luceneFilter = LuceneUtils.buildLuceneQuery(filters) $: luceneFilter = LuceneUtils.buildLuceneQuery(filters)
$: exportOpDisplay = buildExportOpDisplay(sorting, filterDisplay, filters) $: exportOpDisplay = buildExportOpDisplay(sorting, filterDisplay, filters)
@ -190,7 +203,7 @@
<Select <Select
label="Format" label="Format"
bind:value={exportFormat} bind:value={exportFormat}
options={FORMATS} {options}
placeholder={null} placeholder={null}
getOptionLabel={x => x.name} getOptionLabel={x => x.name}
getOptionValue={x => x.key} getOptionValue={x => x.key}

View File

@ -290,11 +290,11 @@
datasource.entities[getTable(toId).name].schema[toRelationship.name] = datasource.entities[getTable(toId).name].schema[toRelationship.name] =
toRelationship toRelationship
await save() await save({ action: "saved" })
} }
async function deleteRelationship() { async function deleteRelationship() {
removeExistingRelationship() removeExistingRelationship()
await save() await save({ action: "deleted" })
await tables.fetch() await tables.fetch()
close() close()
} }

View File

@ -33,7 +33,7 @@
} }
// action is one of 'created', 'updated' or 'deleted' // action is one of 'created', 'updated' or 'deleted'
async function saveRelationship(action) { async function saveRelationship({ action }) {
try { try {
await beforeSave({ action, datasource }) await beforeSave({ action, datasource })

View File

@ -21,6 +21,9 @@
let fieldList let fieldList
let schema let schema
let cachedValue let cachedValue
let options
let sanitisedValue
let unconfigured
$: bindings = getBindableProperties($selectedScreen, componentInstance._id) $: bindings = getBindableProperties($selectedScreen, componentInstance._id)
$: actionType = componentInstance.actionType $: actionType = componentInstance.actionType
@ -34,16 +37,24 @@
} }
$: datasource = getDatasourceForProvider($currentAsset, componentInstance) $: datasource = getDatasourceForProvider($currentAsset, componentInstance)
$: resourceId = datasource.resourceId || datasource.tableId
$: if (!isEqual(value, cachedValue)) { $: if (!isEqual(value, cachedValue)) {
cachedValue = value cachedValue = cloneDeep(value)
schema = getSchema($currentAsset, datasource)
} }
$: options = Object.keys(schema || {}) const updateState = value => {
$: sanitisedValue = getValidColumns(convertOldFieldFormat(value), options) schema = getSchema($currentAsset, datasource)
$: updateSanitsedFields(sanitisedValue) options = Object.keys(schema || {})
$: unconfigured = buildUnconfiguredOptions(schema, sanitisedFields) sanitisedValue = getValidColumns(convertOldFieldFormat(value), options)
updateSanitsedFields(sanitisedValue)
unconfigured = buildUnconfiguredOptions(schema, sanitisedFields)
fieldList = [...sanitisedFields, ...unconfigured]
.map(buildSudoInstance)
.filter(x => x != null)
}
$: updateState(cachedValue, resourceId)
// Builds unused ones only // Builds unused ones only
const buildUnconfiguredOptions = (schema, selected) => { const buildUnconfiguredOptions = (schema, selected) => {
@ -97,7 +108,6 @@
if (instance._component) { if (instance._component) {
return instance return instance
} }
const type = getComponentForField(instance.field, schema) const type = getComponentForField(instance.field, schema)
if (!type) { if (!type) {
return null return null
@ -118,12 +128,6 @@
return { ...instance, ...pseudoComponentInstance } return { ...instance, ...pseudoComponentInstance }
} }
$: if (sanitisedFields) {
fieldList = [...sanitisedFields, ...unconfigured]
.map(buildSudoInstance)
.filter(x => x != null)
}
const processItemUpdate = e => { const processItemUpdate = e => {
const updatedField = e.detail const updatedField = e.detail
const parentFieldsUpdated = fieldList ? cloneDeep(fieldList) : [] const parentFieldsUpdated = fieldList ? cloneDeep(fieldList) : []

View File

@ -8,15 +8,16 @@
const dispatch = createEventDispatcher() const dispatch = createEventDispatcher()
$: tables = $tablesStore.list.map(table => ({ $: tables = $tablesStore.list.map(table => ({
...table,
type: "table", type: "table",
label: table.name, label: table.name,
tableId: table._id,
resourceId: table._id, resourceId: table._id,
})) }))
$: views = $viewsV2.list.map(view => ({ $: views = $viewsV2.list.map(view => ({
...view,
type: "viewV2", type: "viewV2",
id: view.id,
label: view.name, label: view.name,
tableId: view.tableId,
resourceId: view.id, resourceId: view.id,
})) }))
$: options = [...(tables || []), ...(views || [])] $: options = [...(tables || []), ...(views || [])]
@ -32,7 +33,7 @@
// Migrate old values before "resourceId" existed // Migrate old values before "resourceId" existed
if (value && !value.resourceId) { if (value && !value.resourceId) {
const view = views.find(x => x.resourceId === value.id) const view = views.find(x => x.resourceId === value.id)
const table = tables.find(x => x.resourceId === value._id) const table = tables.find(x => x.resourceId === value.tableId)
dispatch("change", view || table) dispatch("change", view || table)
} }
}) })

View File

@ -287,3 +287,9 @@ export const DatasourceTypes = {
GRAPH: "Graph", GRAPH: "Graph",
API: "API", API: "API",
} }
export const ROW_EXPORT_FORMATS = {
CSV: "csv",
JSON: "json",
JSON_WITH_SCHEMA: "jsonWithSchema",
}

View File

@ -120,7 +120,7 @@
await usersFetch.refresh() await usersFetch.refresh()
filteredUsers = $usersFetch.rows filteredUsers = $usersFetch.rows
.filter(user => !user?.admin?.global) // filter out global admins .filter(user => user.email !== $auth.user.email)
.map(user => { .map(user => {
const isAdminOrGlobalBuilder = sdk.users.isAdminOrGlobalBuilder( const isAdminOrGlobalBuilder = sdk.users.isAdminOrGlobalBuilder(
user, user,
@ -150,13 +150,10 @@
} }
const sortInviteRoles = (a, b) => { const sortInviteRoles = (a, b) => {
const aEmpty = const aAppsEmpty = !a.info?.apps?.length && !a.info?.builder?.apps?.length
!a.info?.appBuilders?.length && Object.keys(a.info.apps).length === 0 const bAppsEmpty = !b.info?.apps?.length && !b.info?.builder?.apps?.length
const bEmpty =
!b.info?.appBuilders?.length && Object.keys(b.info.apps).length === 0
if (aEmpty && !bEmpty) return 1 return aAppsEmpty && !bAppsEmpty ? 1 : !aAppsEmpty && bAppsEmpty ? -1 : 0
if (!aEmpty && bEmpty) return -1
} }
const sortRoles = (a, b) => { const sortRoles = (a, b) => {
@ -366,18 +363,19 @@
const payload = [ const payload = [
{ {
email: newUserEmail, email: newUserEmail,
builder: !!creationRoleType === Constants.BudibaseRoles.Admin, builder: { global: creationRoleType === Constants.BudibaseRoles.Admin },
admin: !!creationRoleType === Constants.BudibaseRoles.Admin, admin: { global: creationRoleType === Constants.BudibaseRoles.Admin },
}, },
] ]
if (creationAccessType === Constants.Roles.CREATOR) { const notCreatingAdmin = creationRoleType !== Constants.BudibaseRoles.Admin
payload[0].appBuilders = [prodAppId] const isCreator = creationAccessType === Constants.Roles.CREATOR
} else { if (notCreatingAdmin && isCreator) {
payload[0].apps = { payload[0].builder.apps = [prodAppId]
[prodAppId]: creationAccessType, } else if (notCreatingAdmin && !isCreator) {
} payload[0].apps = { [prodAppId]: creationAccessType }
} }
let userInviteResponse let userInviteResponse
try { try {
userInviteResponse = await users.onboard(payload) userInviteResponse = await users.onboard(payload)
@ -438,10 +436,11 @@
} }
if (role === Constants.Roles.CREATOR) { if (role === Constants.Roles.CREATOR) {
updateBody.appBuilders = [...(updateBody.appBuilders ?? []), prodAppId] updateBody.builder = updateBody.builder || {}
updateBody.builder.apps = [...(updateBody.builder.apps ?? []), prodAppId]
delete updateBody?.apps?.[prodAppId] delete updateBody?.apps?.[prodAppId]
} else if (role !== Constants.Roles.CREATOR && invite?.appBuilders) { } else if (role !== Constants.Roles.CREATOR && invite?.builder?.apps) {
invite.appBuilders = [] invite.builder.apps = []
} }
await users.updateInvite(updateBody) await users.updateInvite(updateBody)
await filterInvites(query) await filterInvites(query)
@ -494,6 +493,18 @@
} }
} }
const getInviteRoleValue = invite => {
if (invite.info?.admin?.global && invite.info?.builder?.global) {
return Constants.Roles.ADMIN
}
if (invite.info?.builder?.apps?.includes(prodAppId)) {
return Constants.Roles.CREATOR
}
return invite.info.apps?.[prodAppId]
}
const getRoleFooter = user => { const getRoleFooter = user => {
if (user.group) { if (user.group) {
const role = $roles.find(role => role._id === user.role) const role = $roles.find(role => role._id === user.role)
@ -531,7 +542,9 @@
<Heading size="S">{invitingFlow ? "Invite new user" : "Users"}</Heading> <Heading size="S">{invitingFlow ? "Invite new user" : "Users"}</Heading>
</div> </div>
<div class="header"> <div class="header">
<Button on:click={openInviteFlow} size="S" cta>Invite user</Button> {#if !invitingFlow}
<Button on:click={openInviteFlow} size="S" cta>Invite user</Button>
{/if}
<Icon <Icon
color="var(--spectrum-global-color-gray-600)" color="var(--spectrum-global-color-gray-600)"
name="RailRightClose" name="RailRightClose"
@ -600,6 +613,11 @@
<div class="auth-entity-access-title">Access</div> <div class="auth-entity-access-title">Access</div>
</div> </div>
{#each filteredInvites as invite} {#each filteredInvites as invite}
{@const user = {
isAdminOrGlobalBuilder:
invite.info?.admin?.global && invite.info?.builder?.global,
}}
<div class="auth-entity"> <div class="auth-entity">
<div class="details"> <div class="details">
<div class="user-email" title={invite.email}> <div class="user-email" title={invite.email}>
@ -608,10 +626,9 @@
</div> </div>
<div class="auth-entity-access"> <div class="auth-entity-access">
<RoleSelect <RoleSelect
footer={getRoleFooter(user)}
placeholder={false} placeholder={false}
value={invite.info?.appBuilders?.includes(prodAppId) value={getInviteRoleValue(invite)}
? Constants.Roles.CREATOR
: invite.info.apps?.[prodAppId]}
allowRemove={invite.info.apps?.[prodAppId]} allowRemove={invite.info.apps?.[prodAppId]}
allowPublic={false} allowPublic={false}
allowCreator={true} allowCreator={true}
@ -624,6 +641,9 @@
}} }}
autoWidth autoWidth
align="right" align="right"
allowedRoles={user.isAdminOrGlobalBuilder
? [Constants.Roles.ADMIN]
: null}
/> />
</div> </div>
</div> </div>

View File

@ -75,43 +75,37 @@
{@const views = Object.values(table.views || {}).filter( {@const views = Object.values(table.views || {}).filter(
view => view.version === 2 view => view.version === 2
)} )}
{@const datasource = { {@const tableDS = {
...table,
// Legacy properties
tableId: table._id, tableId: table._id,
label: table.name, label: table.name,
// New consistent properties
resourceId: table._id, resourceId: table._id,
name: table.name,
type: "table", type: "table",
}} }}
{@const selected = selectedScreens.find( {@const selected = selectedScreens.find(
screen => screen.resourceId === datasource.resourceId screen => screen.resourceId === tableDS.resourceId
)} )}
<DatasourceTemplateRow <DatasourceTemplateRow
on:click={() => toggleSelection(datasource)} on:click={() => toggleSelection(tableDS)}
{selected} {selected}
{datasource} datasource={tableDS}
/> />
<!-- List all views inside this table --> <!-- List all views inside this table -->
{#each views as view} {#each views as view}
{@const datasource = { {@const viewDS = {
...view,
// Legacy properties
label: view.name, label: view.name,
// New consistent properties id: view.id,
resourceId: view.id, resourceId: view.id,
name: view.name, tableId: view.tableId,
type: "viewV2", type: "viewV2",
}} }}
{@const selected = selectedScreens.find( {@const selected = selectedScreens.find(
x => x.resourceId === datasource.resourceId x => x.resourceId === viewDS.resourceId
)} )}
<DatasourceTemplateRow <DatasourceTemplateRow
on:click={() => toggleSelection(datasource)} on:click={() => toggleSelection(viewDS)}
{selected} {selected}
{datasource} datasource={viewDS}
/> />
{/each} {/each}
{/each} {/each}

View File

@ -8,7 +8,7 @@
<div class="data-source-entry" class:selected on:click> <div class="data-source-entry" class:selected on:click>
<Icon name={icon} color="var(--spectrum-global-color-gray-600)" /> <Icon name={icon} color="var(--spectrum-global-color-gray-600)" />
{datasource.name} {datasource.label}
{#if selected} {#if selected}
<span class="data-source-check"> <span class="data-source-check">
<Icon size="S" name="CheckmarkCircle" /> <Icon size="S" name="CheckmarkCircle" />

View File

@ -46,7 +46,7 @@
let loaded = false let loaded = false
let editModal, deleteModal let editModal, deleteModal
$: console.log(group)
$: scimEnabled = $features.isScimEnabled $: scimEnabled = $features.isScimEnabled
$: readonly = !sdk.users.isAdmin($auth.user) || scimEnabled $: readonly = !sdk.users.isAdmin($auth.user) || scimEnabled
$: group = $groups.find(x => x._id === groupId) $: group = $groups.find(x => x._id === groupId)
@ -62,7 +62,7 @@
? Constants.Roles.CREATOR ? Constants.Roles.CREATOR
: group?.roles?.[apps.getProdAppID(app.devId)], : group?.roles?.[apps.getProdAppID(app.devId)],
})) }))
$: console.log(groupApps)
$: { $: {
if (loaded && !group?._id) { if (loaded && !group?._id) {
$goto("./") $goto("./")

View File

@ -5,7 +5,6 @@
export let value export let value
export let row export let row
$: console.log(row)
$: priviliged = sdk.users.isAdminOrBuilder(row) $: priviliged = sdk.users.isAdminOrBuilder(row)
$: count = getCount(row) $: count = getCount(row)
@ -14,10 +13,10 @@
return $apps.length return $apps.length
} else { } else {
return sdk.users.hasAppBuilderPermissions(row) return sdk.users.hasAppBuilderPermissions(row)
? row.builder.apps.length + ? row?.builder?.apps?.length +
Object.keys(row.roles || {}).filter(appId => Object.keys(row.roles || {}).filter(appId => {
row.builder.apps.includes(appId) row?.builder?.apps?.includes(appId)
).length }).length
: value?.length || 0 : value?.length || 0
} }
} }

View File

@ -10,7 +10,7 @@
admin: "Full access", admin: "Full access",
} }
$: role = Constants.BudibaseRoleOptions.find( $: role = Constants.BudibaseRoleOptionsOld.find(
x => x.value === users.getUserRole(row) x => x.value === users.getUserRole(row)
) )
$: value = role?.label || "Not available" $: value = role?.label || "Not available"

View File

@ -121,8 +121,11 @@ export function createUsersStore() {
} }
const getUserRole = user => const getUserRole = user =>
sdk.users.isAdminOrGlobalBuilder(user) ? "admin" : "appUser" sdk.users.isAdmin(user)
? "admin"
: sdk.users.isBuilder(user)
? "developer"
: "appUser"
const refreshUsage = const refreshUsage =
fn => fn =>
async (...args) => { async (...args) => {

View File

@ -3647,9 +3647,9 @@
}, },
{ {
"type": "boolean", "type": "boolean",
"label": "Autocomplete", "label": "Search",
"key": "autocomplete", "key": "autocomplete",
"defaultValue": false "defaultValue": true
}, },
{ {
"type": "boolean", "type": "boolean",
@ -4745,7 +4745,8 @@
"dependsOn": { "dependsOn": {
"setting": "clickBehaviour", "setting": "clickBehaviour",
"value": "details" "value": "details"
} },
"resetOn": "dataSource"
}, },
{ {
"label": "Save button", "label": "Save button",
@ -5397,6 +5398,7 @@
"type": "fieldConfiguration", "type": "fieldConfiguration",
"key": "fields", "key": "fields",
"nested": true, "nested": true,
"resetOn": "dataSource",
"selectAllFields": true "selectAllFields": true
}, },
{ {

View File

@ -275,7 +275,7 @@
dataSource, dataSource,
showSaveButton: true, showSaveButton: true,
showDeleteButton: false, showDeleteButton: false,
saveButtonLabel: sidePanelSaveLabel, saveButtonLabel: sidePanelSaveLabel || "Save", //always show
actionType: "Create", actionType: "Create",
fields: sidePanelFields || normalFields, fields: sidePanelFields || normalFields,
title: "Create Row", title: "Create Row",

View File

@ -211,17 +211,19 @@
{/if} {/if}
</BlockComponent> </BlockComponent>
{/if} {/if}
<BlockComponent type="fieldgroup" props={{ labelPosition }} order={1}> {#key fields}
{#each fields as field, idx} <BlockComponent type="fieldgroup" props={{ labelPosition }} order={1}>
{#if getComponentForField(field) && field.active} {#each fields as field, idx}
<BlockComponent {#if getComponentForField(field) && field.active}
type={getComponentForField(field)} <BlockComponent
props={getPropsForField(field)} type={getComponentForField(field)}
order={idx} props={getPropsForField(field)}
/> order={idx}
{/if} />
{/each} {/if}
</BlockComponent> {/each}
</BlockComponent>
{/key}
</BlockComponent> </BlockComponent>
</BlockComponent> </BlockComponent>
{:else} {:else}

View File

@ -136,7 +136,7 @@
// Check arrays - remove any values not present in the field schema and // Check arrays - remove any values not present in the field schema and
// convert any values supplied to strings // convert any values supplied to strings
if (Array.isArray(value) && type === "array" && schema) { if (Array.isArray(value) && type === "array" && schema) {
const options = schema?.constraints.inclusion || [] const options = schema?.constraints?.inclusion || []
return value.map(opt => String(opt)).filter(opt => options.includes(opt)) return value.map(opt => String(opt)).filter(opt => options.includes(opt))
} }
return value return value

View File

@ -1,6 +1,11 @@
<script> <script>
import { CoreSelect, CoreMultiselect } from "@budibase/bbui" import {
import { fetchData } from "@budibase/frontend-core" CoreSelect,
CoreMultiselect,
Input,
ProgressCircle,
} from "@budibase/bbui"
import { fetchData, Utils } from "@budibase/frontend-core"
import { getContext } from "svelte" import { getContext } from "svelte"
import Field from "./Field.svelte" import Field from "./Field.svelte"
import { FieldTypes } from "../../../constants" import { FieldTypes } from "../../../constants"
@ -12,7 +17,7 @@
export let placeholder export let placeholder
export let disabled = false export let disabled = false
export let validation export let validation
export let autocomplete = false export let autocomplete = true
export let defaultValue export let defaultValue
export let onChange export let onChange
export let filter export let filter
@ -21,6 +26,16 @@
let fieldApi let fieldApi
let fieldSchema let fieldSchema
let tableDefinition let tableDefinition
let primaryDisplay
let options
let selectedOptions = []
let isOpen = false
let hasFilter
let searchResults
let searchString
let searching = false
let lastSearchId
$: multiselect = fieldSchema?.relationshipType !== "one-to-many" $: multiselect = fieldSchema?.relationshipType !== "one-to-many"
$: linkedTableId = fieldSchema?.tableId $: linkedTableId = fieldSchema?.tableId
@ -35,13 +50,57 @@
limit: 100, limit: 100,
}, },
}) })
$: hasFilter = !!filter?.filter(f => !!f.field)?.length
$: fetch.update({ filter }) $: fetch.update({ filter })
$: options = $fetch.rows $: {
options = searchResults ? searchResults : $fetch.rows
const nonMatchingOptions = selectedOptions.filter(
option => !options.map(opt => opt._id).includes(option._id)
)
// Append initially selected options if there is no filter
// and hasn't already been appended
if (!hasFilter) {
options = [...options, ...nonMatchingOptions]
}
}
$: tableDefinition = $fetch.definition $: tableDefinition = $fetch.definition
$: primaryDisplay = tableDefinition?.primaryDisplay || "_id"
$: singleValue = flatten(fieldState?.value)?.[0] $: singleValue = flatten(fieldState?.value)?.[0]
$: multiValue = flatten(fieldState?.value) ?? [] $: multiValue = flatten(fieldState?.value) ?? []
$: component = multiselect ? CoreMultiselect : CoreSelect $: component = multiselect ? CoreMultiselect : CoreSelect
$: expandedDefaultValue = expand(defaultValue) $: expandedDefaultValue = expand(defaultValue)
$: debouncedSearch(searchString)
$: {
if (searching) {
isOpen = true
}
}
// Fetch the initially selected values
// as they may not be within the first 100 records
$: {
if (
primaryDisplay !== "_id" &&
fieldState?.value?.length &&
!selectedOptions?.length
) {
API.searchTable({
paginate: false,
tableId: linkedTableId,
limit: 100,
query: {
oneOf: {
[`1:${primaryDisplay}`]: fieldState?.value?.map(
value => value.primaryDisplay
),
},
},
}).then(response => {
const value = multiselect ? multiValue : singleValue
selectedOptions = response.rows.filter(row => value.includes(row._id))
})
}
}
const flatten = values => { const flatten = values => {
if (!values) { if (!values) {
@ -77,10 +136,66 @@
const handleChange = value => { const handleChange = value => {
const changed = fieldApi.setValue(value) const changed = fieldApi.setValue(value)
selectedOptions = value.map(val => ({
_id: val,
[primaryDisplay]: options.find(option => option._id === val)[
primaryDisplay
],
}))
if (onChange && changed) { if (onChange && changed) {
onChange({ value }) onChange({ value })
} }
} }
// Search for rows based on the search string
const search = async searchString => {
// Reset state if this search is invalid
if (!linkedTableId || !searchString) {
searchResults = null
return
}
// If a filter exists, then do a client side search
if (hasFilter) {
searchResults = $fetch.rows.filter(option =>
option[primaryDisplay].startsWith(searchString)
)
isOpen = true
return
}
// Search for results, using IDs to track invocations and ensure we're
// handling the latest update
lastSearchId = Math.random()
searching = true
const thisSearchId = lastSearchId
const results = await API.searchTable({
paginate: false,
tableId: linkedTableId,
limit: 100,
query: {
string: {
[`1:${primaryDisplay}`]: searchString || "",
},
},
})
searching = false
// In case searching takes longer than our debounced update, abandon these
// results
if (thisSearchId !== lastSearchId) {
return
}
// Process results
searchResults = results.rows?.map(row => ({
...row,
primaryDisplay: row[primaryDisplay],
}))
}
// Debounced version of searching
const debouncedSearch = Utils.debounce(search, 250)
</script> </script>
<Field <Field
@ -95,19 +210,63 @@
bind:fieldSchema bind:fieldSchema
> >
{#if fieldState} {#if fieldState}
<svelte:component <div class={autocomplete ? "field-with-search" : ""}>
this={component} <svelte:component
{options} this={component}
{autocomplete} bind:open={isOpen}
value={multiselect ? multiValue : singleValue} {options}
on:change={multiselect ? multiHandler : singleHandler} autocomplete={false}
id={fieldState.fieldId} value={multiselect ? multiValue : singleValue}
disabled={fieldState.disabled} on:change={multiselect ? multiHandler : singleHandler}
error={fieldState.error} id={fieldState.fieldId}
getOptionLabel={getDisplayName} disabled={fieldState.disabled}
getOptionValue={option => option._id} error={fieldState.error}
{placeholder} getOptionLabel={getDisplayName}
sort={true} getOptionValue={option => option._id}
/> {placeholder}
customPopoverOffsetBelow={autocomplete ? 32 : null}
customPopoverMaxHeight={autocomplete ? 240 : null}
sort={true}
/>
{#if autocomplete}
<div class="search">
<Input
autofocus
quiet
type="text"
bind:value={searchString}
placeholder={primaryDisplay ? `Search by ${primaryDisplay}` : null}
/>
{#if searching}
<div>
<ProgressCircle size="S" />
</div>
{/if}
</div>
{/if}
</div>
{/if} {/if}
</Field> </Field>
<style>
.search {
flex: 0 0 calc(var(--default-row-height) - 1px);
display: flex;
align-items: center;
margin: 4px var(--cell-padding);
width: calc(100% - 2 * var(--cell-padding));
}
.search :global(.spectrum-Textfield) {
min-width: 0;
width: 100%;
}
.search :global(.spectrum-Textfield-input) {
font-size: 13px;
}
.search :global(.spectrum-Form-item) {
flex: 1 1 auto;
}
.field-with-search {
min-height: 80px;
}
</style>

View File

@ -144,8 +144,8 @@ export const buildUserEndpoints = API => ({
body: { body: {
email, email,
userInfo: { userInfo: {
admin: admin ? { global: true } : undefined, admin: admin?.global ? { global: true } : undefined,
builder: builder ? { global: true } : undefined, builder: builder?.global ? { global: true } : undefined,
apps: apps ? apps : undefined, apps: apps ? apps : undefined,
}, },
}, },
@ -156,14 +156,13 @@ export const buildUserEndpoints = API => ({
return await API.post({ return await API.post({
url: "/api/global/users/onboard", url: "/api/global/users/onboard",
body: payload.map(invite => { body: payload.map(invite => {
const { email, admin, builder, apps, appBuilders } = invite const { email, admin, builder, apps } = invite
return { return {
email, email,
userInfo: { userInfo: {
admin: admin ? { global: true } : undefined, admin,
builder: builder ? { global: true } : undefined, builder,
apps: apps ? apps : undefined, apps: apps ? apps : undefined,
appBuilders,
}, },
} }
}), }),
@ -176,12 +175,11 @@ export const buildUserEndpoints = API => ({
* @param invite the invite code sent in the email * @param invite the invite code sent in the email
*/ */
updateUserInvite: async invite => { updateUserInvite: async invite => {
console.log(invite)
await API.post({ await API.post({
url: `/api/global/users/invite/update/${invite.code}`, url: `/api/global/users/invite/update/${invite.code}`,
body: { body: {
apps: invite.apps, apps: invite.apps,
appBuilders: invite.appBuilders, builder: invite.builder,
}, },
}) })
}, },

View File

@ -23,6 +23,11 @@ export const BudibaseRoles = {
Admin: "admin", Admin: "admin",
} }
export const BudibaseRoleOptionsOld = [
{ label: "Developer", value: BudibaseRoles.Developer },
{ label: "Member", value: BudibaseRoles.AppUser },
{ label: "Admin", value: BudibaseRoles.Admin },
]
export const BudibaseRoleOptions = [ export const BudibaseRoleOptions = [
{ label: "Member", value: BudibaseRoles.AppUser }, { label: "Member", value: BudibaseRoles.AppUser },
{ label: "Admin", value: BudibaseRoles.Admin }, { label: "Admin", value: BudibaseRoles.Admin },

View File

@ -1,5 +1,6 @@
export { createAPIClient } from "./api" export { createAPIClient } from "./api"
export { fetchData } from "./fetch/fetchData" export { fetchData } from "./fetch/fetchData"
export { Utils } from "./utils"
export * as Constants from "./constants" export * as Constants from "./constants"
export * from "./stores" export * from "./stores"
export * from "./utils" export * from "./utils"

View File

@ -39,8 +39,9 @@ import {
} from "../../db/defaultData/datasource_bb_default" } from "../../db/defaultData/datasource_bb_default"
import { removeAppFromUserRoles } from "../../utilities/workerRequests" import { removeAppFromUserRoles } from "../../utilities/workerRequests"
import { stringToReadStream } from "../../utilities" import { stringToReadStream } from "../../utilities"
import { doesUserHaveLock } from "../../utilities/redis" import { doesUserHaveLock, getLocksById } from "../../utilities/redis"
import { cleanupAutomations } from "../../automations/utils" import { cleanupAutomations } from "../../automations/utils"
import { checkAppMetadata } from "../../automations/logging"
import { getUniqueRows } from "../../utilities/usageQuota/rows" import { getUniqueRows } from "../../utilities/usageQuota/rows"
import { groups, licensing, quotas } from "@budibase/pro" import { groups, licensing, quotas } from "@budibase/pro"
import { import {
@ -50,6 +51,7 @@ import {
PlanType, PlanType,
Screen, Screen,
UserCtx, UserCtx,
ContextUser,
} from "@budibase/types" } from "@budibase/types"
import { BASE_LAYOUT_PROP_IDS } from "../../constants/layouts" import { BASE_LAYOUT_PROP_IDS } from "../../constants/layouts"
import sdk from "../../sdk" import sdk from "../../sdk"

View File

@@ -20,7 +20,7 @@ import {
   Automation,
   AutomationActionStepId,
   AutomationResults,
-  Ctx,
+  BBContext,
 } from "@budibase/types"
 import { getActionDefinitions as actionDefs } from "../../automations/actions"
 import sdk from "../../sdk"
@@ -73,7 +73,7 @@ function cleanAutomationInputs(automation: Automation) {
   return automation
 }

-export async function create(ctx: Ctx) {
+export async function create(ctx: BBContext) {
   const db = context.getAppDB()
   let automation = ctx.request.body
   automation.appId = ctx.appId
@@ -142,7 +142,7 @@ export async function handleStepEvents(
   }
 }

-export async function update(ctx: Ctx) {
+export async function update(ctx: BBContext) {
   const db = context.getAppDB()
   let automation = ctx.request.body
   automation.appId = ctx.appId
@@ -193,7 +193,7 @@ export async function update(ctx: Ctx) {
   builderSocket?.emitAutomationUpdate(ctx, automation)
 }

-export async function fetch(ctx: Ctx) {
+export async function fetch(ctx: BBContext) {
   const db = context.getAppDB()
   const response = await db.allDocs(
     getAutomationParams(null, {
@@ -203,11 +203,12 @@ export async function fetch(ctx: Ctx) {
   ctx.body = response.rows.map(row => row.doc)
 }

-export async function find(ctx: Ctx) {
-  ctx.body = await sdk.automations.get(ctx.params.id)
+export async function find(ctx: BBContext) {
+  const db = context.getAppDB()
+  ctx.body = await db.get(ctx.params.id)
 }

-export async function destroy(ctx: Ctx) {
+export async function destroy(ctx: BBContext) {
   const db = context.getAppDB()
   const automationId = ctx.params.id
   const oldAutomation = await db.get<Automation>(automationId)
@@ -221,11 +222,11 @@ export async function destroy(ctx: Ctx) {
   builderSocket?.emitAutomationDeletion(ctx, automationId)
 }

-export async function logSearch(ctx: Ctx) {
+export async function logSearch(ctx: BBContext) {
   ctx.body = await automations.logs.logSearch(ctx.request.body)
 }

-export async function clearLogError(ctx: Ctx) {
+export async function clearLogError(ctx: BBContext) {
   const { automationId, appId } = ctx.request.body
   await context.doInAppContext(appId, async () => {
     const db = context.getProdAppDB()
@@ -244,15 +245,15 @@ export async function clearLogError(ctx: Ctx) {
   })
 }

-export async function getActionList(ctx: Ctx) {
+export async function getActionList(ctx: BBContext) {
   ctx.body = await getActionDefinitions()
 }

-export async function getTriggerList(ctx: Ctx) {
+export async function getTriggerList(ctx: BBContext) {
   ctx.body = getTriggerDefinitions()
 }

-export async function getDefinitionList(ctx: Ctx) {
+export async function getDefinitionList(ctx: BBContext) {
   ctx.body = {
     trigger: getTriggerDefinitions(),
     action: await getActionDefinitions(),
@@ -265,7 +266,7 @@ export async function getDefinitionList(ctx: Ctx) {
  * *
  *********************/

-export async function trigger(ctx: Ctx) {
+export async function trigger(ctx: BBContext) {
   const db = context.getAppDB()
   let automation = await db.get<Automation>(ctx.params.id)
@@ -310,7 +311,7 @@ function prepareTestInput(input: any) {
   return input
 }

-export async function test(ctx: Ctx) {
+export async function test(ctx: BBContext) {
   const db = context.getAppDB()
   let automation = await db.get<Automation>(ctx.params.id)
   await setTestFlag(automation._id!)


@@ -95,7 +95,7 @@ export async function fetchView(ctx: any) {
     () =>
       sdk.rows.fetchView(tableId, viewName, {
         calculation,
-        group,
+        group: calculation ? group : null,
         field,
       }),
     {
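A note on the change above, not part of the diff: `group` is now only forwarded to the view query when a `calculation` is also requested, since grouping has no effect without an aggregation. A minimal sketch of that guard in isolation (`normaliseViewParams` is a made-up helper name, not Budibase API):

```typescript
// Sketch only: drop the group-by unless an aggregation was requested.
interface ViewParams {
  calculation?: string
  group?: string | null
  field?: string
}

function normaliseViewParams(params: ViewParams): ViewParams {
  return {
    calculation: params.calculation,
    group: params.calculation ? params.group : null, // mirrors the diff above
    field: params.field,
  }
}
```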


@@ -27,7 +27,7 @@ export function json(rows: Row[]) {
 export function jsonWithSchema(schema: TableSchema, rows: Row[]) {
   const newSchema: TableSchema = {}
   Object.values(schema).forEach(column => {
-    if (!column.autocolumn) {
+    if (!column.autocolumn && column.name) {
       newSchema[column.name] = column
     }
   })
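Also not part of the diff: the extra `column.name` check above stops a nameless schema entry from being written under the key `"undefined"` in the exported schema. A small sketch of the failure mode it guards against (the schema literal is invented for illustration):

```typescript
// Sketch only: filtering out autocolumns and nameless columns during export.
const schema: Record<string, any> = {
  title: { name: "title", type: "string" },
  broken: { type: "string" }, // hypothetical entry with no name
}

const exported: Record<string, any> = {}
Object.values(schema).forEach(column => {
  if (!column.autocolumn && column.name) {
    exported[column.name] = column // "broken" is skipped rather than keyed as "undefined"
  }
})
```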


@@ -6,11 +6,11 @@ import { isDevAppID } from "../db/utils"
 // need this to call directly, so we can get a response
 import { automationQueue } from "./bullboard"
 import { checkTestFlag } from "../utilities/redis"
+import * as utils from "./utils"
 import env from "../environment"
 import { context, db as dbCore } from "@budibase/backend-core"
 import { Automation, Row, AutomationData, AutomationJob } from "@budibase/types"
 import { executeSynchronously } from "../threads/automation"
-import sdk from "../sdk"

 export const TRIGGER_DEFINITIONS = definitions
 const JOB_OPTS = {
@@ -142,7 +142,7 @@ export async function rebootTrigger() {
   let automations = await getAllAutomations()
   let rebootEvents = []
   for (let automation of automations) {
-    if (sdk.automations.isReboot(automation)) {
+    if (utils.isRebootTrigger(automation)) {
       const job = {
         automation,
         event: {


@@ -17,17 +17,16 @@ import {
 import sdk from "../sdk"
 import { automationsEnabled } from "../features"

+const REBOOT_CRON = "@reboot"
 const WH_STEP_ID = definitions.WEBHOOK.stepId
+const CRON_STEP_ID = definitions.CRON.stepId

 let Runner: Thread
 if (automationsEnabled()) {
   Runner = new Thread(ThreadType.AUTOMATION)
 }

-function loggingArgs(
-  job: AutomationJob,
-  timing?: { start: number; complete?: boolean }
-) {
-  const logs: any[] = [
+function loggingArgs(job: AutomationJob) {
+  return [
     {
       _logKey: "automation",
       trigger: job.data.automation.definition.trigger.event,
@@ -37,53 +36,24 @@ function loggingArgs(
       jobId: job.id,
     },
   ]
-  if (timing?.start) {
-    logs.push({
-      _logKey: "startTime",
-      start: timing.start,
-    })
-  }
-  if (timing?.start && timing?.complete) {
-    const end = new Date().getTime()
-    const duration = end - timing.start
-    logs.push({
-      _logKey: "endTime",
-      end,
-    })
-    logs.push({
-      _logKey: "duration",
-      duration,
-    })
-  }
-  return logs
 }

 export async function processEvent(job: AutomationJob) {
   const appId = job.data.event.appId!
   const automationId = job.data.automation._id!
-  const start = new Date().getTime()
   const task = async () => {
     try {
       // need to actually await these so that an error can be captured properly
-      console.log("automation running", ...loggingArgs(job, { start }))
+      console.log("automation running", ...loggingArgs(job))
       const runFn = () => Runner.run(job)
       const result = await quotas.addAutomation(runFn, {
         automationId,
       })
-      const end = new Date().getTime()
-      const duration = end - start
-      console.log(
-        "automation completed",
-        ...loggingArgs(job, { start, complete: true })
-      )
+      console.log("automation completed", ...loggingArgs(job))
       return result
     } catch (err) {
-      console.error(
-        `automation was unable to run`,
-        err,
-        ...loggingArgs(job, { start, complete: true })
-      )
+      console.error(`automation was unable to run`, err, ...loggingArgs(job))
       return { err }
     }
   }
@@ -163,6 +133,19 @@ export async function clearMetadata() {
   await db.bulkDocs(automationMetadata)
 }

+export function isCronTrigger(auto: Automation) {
+  return (
+    auto &&
+    auto.definition.trigger &&
+    auto.definition.trigger.stepId === CRON_STEP_ID
+  )
+}
+
+export function isRebootTrigger(auto: Automation) {
+  const trigger = auto ? auto.definition.trigger : null
+  return isCronTrigger(auto) && trigger?.inputs.cron === REBOOT_CRON
+}
+
 /**
  * This function handles checking of any cron jobs that need to be enabled/updated.
  * @param {string} appId The ID of the app in which we are checking for webhooks
@@ -170,14 +153,14 @@ export async function clearMetadata() {
  */
 export async function enableCronTrigger(appId: any, automation: Automation) {
   const trigger = automation ? automation.definition.trigger : null
-  const validCron = sdk.automations.isCron(automation) && trigger?.inputs.cron
-  const needsCreated =
-    !sdk.automations.isReboot(automation) &&
-    !sdk.automations.disabled(automation)
   let enabled = false

   // need to create cron job
-  if (validCron && needsCreated) {
+  if (
+    isCronTrigger(automation) &&
+    !isRebootTrigger(automation) &&
+    trigger?.inputs.cron
+  ) {
     // make a job id rather than letting Bull decide, makes it easier to handle on way out
     const jobId = `${appId}_cron_${newid()}`
     const job: any = await automationQueue.add(

@@ -1,38 +0,0 @@
-import { context } from "@budibase/backend-core"
-import { Automation, AutomationState, DocumentType } from "@budibase/types"
-import { definitions } from "../../../automations/triggerInfo"
-
-const REBOOT_CRON = "@reboot"
-
-export async function exists(automationId: string) {
-  if (!automationId?.startsWith(DocumentType.AUTOMATION)) {
-    throw new Error("Invalid automation ID.")
-  }
-  const db = context.getAppDB()
-  return db.docExists(automationId)
-}
-
-export async function get(automationId: string) {
-  const db = context.getAppDB()
-  return (await db.get(automationId)) as Automation
-}
-
-export function disabled(automation: Automation) {
-  return automation.state === AutomationState.DISABLED || !hasSteps(automation)
-}
-
-export function isCron(automation: Automation) {
-  return (
-    automation?.definition.trigger &&
-    automation?.definition.trigger.stepId === definitions.CRON.stepId
-  )
-}
-
-export function isReboot(automation: Automation) {
-  const trigger = automation?.definition.trigger
-  return isCron(automation) && trigger?.inputs.cron === REBOOT_CRON
-}
-
-export function hasSteps(automation: Automation) {
-  return automation?.definition?.steps?.length > 0
-}


@@ -1,9 +1,7 @@
 import * as webhook from "./webhook"
 import * as utils from "./utils"
-import * as automations from "./automations"

 export default {
   webhook,
   utils,
-  ...automations,
 }


@@ -1,18 +1,13 @@
-import { context, db, env, roles } from "@budibase/backend-core"
+import { db, env, roles } from "@budibase/backend-core"
 import { features } from "@budibase/pro"
 import {
   DocumentType,
   PermissionLevel,
   PermissionSource,
   PlanType,
-  Role,
   VirtualDocumentType,
 } from "@budibase/types"
-import {
-  extractViewInfoFromID,
-  getRoleParams,
-  isViewID,
-} from "../../../db/utils"
+import { extractViewInfoFromID, isViewID } from "../../../db/utils"
 import {
   CURRENTLY_SUPPORTED_LEVELS,
   getBasePermissions,
@@ -84,13 +79,8 @@ export async function allowsExplicitPermissions(resourceId: string) {
 export async function getResourcePerms(
   resourceId: string
 ): Promise<ResourcePermissions> {
-  const db = context.getAppDB()
-  const body = await db.allDocs(
-    getRoleParams(null, {
-      include_docs: true,
-    })
-  )
-  const rolesList = body.rows.map<Role>(row => row.doc)
+  const rolesList = await roles.getAllRoles()
   let permissions: ResourcePermissions = {}
   const permsToInherit = await getInheritablePermissions(resourceId)


@@ -10,7 +10,6 @@ mocks.licenses.useUnlimited()
 import { init as dbInit } from "../../db"
 dbInit()
 import env from "../../environment"
-import { env as coreEnv } from "@budibase/backend-core"
 import {
   basicTable,
   basicRow,
@@ -33,6 +32,7 @@ import {
   encryption,
   auth,
   roles,
+  env as coreEnv,
 } from "@budibase/backend-core"
 import * as controllers from "./controllers"
 import { cleanup } from "../../utilities/fileSystem"
@@ -51,7 +51,6 @@ import {
   UserRoles,
   Automation,
 } from "@budibase/types"
-import { BUILTIN_ROLE_IDS } from "@budibase/backend-core/src/security/roles"

 import API from "./api"
@@ -317,7 +316,7 @@ class TestConfiguration {
     }
   }

-  async createGroup(roleId: string = BUILTIN_ROLE_IDS.BASIC) {
+  async createGroup(roleId: string = roles.BUILTIN_ROLE_IDS.BASIC) {
     return context.doInTenant(this.tenantId!, async () => {
       const baseGroup = structures.userGroups.userGroup()
       baseGroup.roles = {


@@ -1,6 +1,5 @@
 import { default as threadUtils } from "./utils"
 import { Job } from "bull"
-threadUtils.threadSetup()
 import {
   disableCronById,
   isErrorInOutput,
@@ -35,8 +34,8 @@ import { cloneDeep } from "lodash/fp"
 import { performance } from "perf_hooks"
 import * as sdkUtils from "../sdk/utils"
 import env from "../environment"
-import sdk from "../sdk"

+threadUtils.threadSetup()
 const FILTER_STEP_ID = actions.BUILTIN_ACTION_DEFINITIONS.FILTER.stepId
 const LOOP_STEP_ID = actions.BUILTIN_ACTION_DEFINITIONS.LOOP.stepId
 const CRON_STEP_ID = triggerDefs.CRON.stepId
@@ -520,8 +519,7 @@
 export function execute(job: Job<AutomationData>, callback: WorkerCallback) {
   const appId = job.data.event.appId
-  const automation = job.data.automation
-  const automationId = automation._id
+  const automationId = job.data.automation._id
   if (!appId) {
     throw new Error("Unable to execute, event doesn't contain app ID.")
   }
@@ -532,30 +530,10 @@ export function execute(job: Job<AutomationData>, callback: WorkerCallback) {
     appId,
     automationId,
     task: async () => {
-      let automation = job.data.automation,
-        isCron = sdk.automations.isCron(job.data.automation),
-        notFound = false
-      try {
-        automation = await sdk.automations.get(automationId)
-      } catch (err: any) {
-        // automation no longer exists
-        notFound = err
-      }
-      const disabled = sdk.automations.disabled(automation)
-      const stopAutomation = disabled || notFound
       const envVars = await sdkUtils.getEnvironmentVariables()
       // put into automation thread for whole context
       await context.doInEnvironmentContext(envVars, async () => {
         const automationOrchestrator = new Orchestrator(job)
-        // hard stop on automations
-        if (isCron && stopAutomation) {
-          await automationOrchestrator.stopCron(
-            disabled ? "disabled" : "not_found"
-          )
-        }
-        if (stopAutomation) {
-          return
-        }
         try {
           const response = await automationOrchestrator.execute()
           callback(null, response)


@@ -100,10 +100,6 @@ export const AutomationStepIdArray = [
   ...Object.values(AutomationTriggerStepId),
 ]

-export enum AutomationState {
-  DISABLED = "disabled",
-}
-
 export interface Automation extends Document {
   definition: {
     steps: AutomationStep[]
@@ -116,7 +112,6 @@ export interface Automation extends Document {
   name: string
   internal?: boolean
   type?: string
-  state?: AutomationState
 }

 interface BaseIOStructure {


@@ -40,11 +40,6 @@ export type DatabasePutOpts = {
   force?: boolean
 }

-export type DocExistsResponse = {
-  _rev?: string
-  exists: boolean
-}
-
 export type DatabaseCreateIndexOpts = {
   index: {
     fields: string[]
@@ -95,7 +90,6 @@ export interface Database {
   exists(): Promise<boolean>
   checkSetup(): Promise<Nano.DocumentScope<any>>
   get<T>(id?: string): Promise<T>
-  docExists(id: string): Promise<DocExistsResponse>
   remove(
     id: string | Document,
     rev?: string


@@ -266,17 +266,14 @@ export const onboardUsers = async (ctx: Ctx<InviteUsersRequest>) => {
       // Temp password to be passed to the user.
       createdPasswords[invite.email] = password

-      let builder: { global: boolean; apps?: string[] } = { global: false }
-      if (invite.userInfo.appBuilders) {
-        builder.apps = invite.userInfo.appBuilders
-      }
-
       return {
         email: invite.email,
         password,
         forceResetPassword: true,
         roles: invite.userInfo.apps,
-        admin: { global: false },
-        builder,
+        admin: invite.userInfo.admin,
+        builder: invite.userInfo.builder,
         tenantId: tenancy.getTenantId(),
       }
     })
@@ -371,13 +368,10 @@ export const updateInvite = async (ctx: any) => {
     ...invite,
   }

-  if (!updateBody?.appBuilders || !updateBody.appBuilders?.length) {
-    updated.info.appBuilders = []
-  } else {
-    updated.info.appBuilders = [
-      ...(invite.info.appBuilders ?? []),
-      ...updateBody.appBuilders,
-    ]
+  if (!updateBody?.builder?.apps && updated.info?.builder?.apps) {
+    updated.info.builder.apps = []
+  } else if (updateBody?.builder) {
+    updated.info.builder = updateBody.builder
   }

   if (!updateBody?.apps || !Object.keys(updateBody?.apps).length) {
@@ -409,15 +403,17 @@ export const inviteAccept = async (
       lastName,
       password,
       email,
+      admin: { global: info?.admin?.global || false },
       roles: info.apps,
       tenantId: info.tenantId,
     }

-    let builder: { global: boolean; apps?: string[] } = { global: false }
+    let builder: { global: boolean; apps?: string[] } = {
+      global: info?.builder?.global || false,
+    }

-    if (info.appBuilders) {
-      builder.apps = info.appBuilders
+    if (info?.builder?.apps) {
+      builder.apps = info.builder.apps
       request.builder = builder
-      delete info.appBuilders
     }

     delete info.apps
     request = {
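A note on the invite changes above (not part of the diff): the handlers now read structured `admin` and `builder` objects from the invite info instead of the old flat `appBuilders` array. A rough sketch of the shape they appear to expect, with field names taken from the diff but the interface itself purely illustrative:

```typescript
// Illustrative only — not the actual Budibase type definitions.
interface InviteInfoSketch {
  apps?: Record<string, string> // per-app role assignments
  admin?: { global?: boolean } // global admin flag
  builder?: {
    global?: boolean // builder access to every app
    apps?: string[] // or builder access to specific apps
  }
  tenantId?: string
}
```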


@@ -15,7 +15,11 @@ const { nodeExternalsPlugin } = require("esbuild-node-externals")

 var argv = require("minimist")(process.argv.slice(2))

-function runBuild(entry, outfile) {
+function runBuild(
+  entry,
+  outfile,
+  opts = { skipMeta: false, bundle: true, silent: false }
+) {
   const isDev = process.env.NODE_ENV !== "production"
   const tsconfig = argv["p"] || `tsconfig.build.json`
   const tsconfigPathPluginContent = JSON.parse(
@@ -36,12 +40,16 @@ function runBuild(entry, outfile) {
     ]
   }

+  const metafile = !opts.skipMeta
+  const { bundle } = opts
+
   const sharedConfig = {
     entryPoints: [entry],
-    bundle: true,
+    bundle,
     minify: !isDev,
     sourcemap: isDev,
     tsconfig,
+    format: opts?.forcedFormat,
     plugins: [
       TsconfigPathsPlugin({ tsconfig: tsconfigPathPluginContent }),
       nodeExternalsPlugin(),
@@ -50,15 +58,10 @@ function runBuild(entry, outfile) {
     loader: {
       ".svelte": "copy",
     },
-    metafile: true,
-    external: [
-      "deasync",
-      "mock-aws-s3",
-      "nock",
-      "pino",
-      "koa-pino-logger",
-      "bull",
-    ],
+    metafile,
+    external: bundle
+      ? ["deasync", "mock-aws-s3", "nock", "pino", "koa-pino-logger", "bull"]
+      : undefined,
   }

   build({
@@ -71,16 +74,19 @@ function runBuild(entry, outfile) {
       fs.copyFileSync(file, `${process.cwd()}/dist/${path.basename(file)}`)
     }

-      console.log(
-        "\x1b[32m%s\x1b[0m",
-        `Build successfully in ${(Date.now() - start) / 1000} seconds`
-      )
+      !opts.silent &&
+        console.log(
+          "\x1b[32m%s\x1b[0m",
+          `Build successfully in ${(Date.now() - start) / 1000} seconds`
+        )
     })

-    fs.writeFileSync(
-      `dist/${path.basename(outfile)}.meta.json`,
-      JSON.stringify(result.metafile)
-    )
+    if (metafile) {
+      fs.writeFileSync(
+        `dist/${path.basename(outfile)}.meta.json`,
+        JSON.stringify(result.metafile)
+      )
+    }
   })
 }
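For clarity (not part of the diff): `runBuild` now accepts an optional options object, so individual entry points can skip the metafile, skip bundling, or silence the success log. A hedged usage example; the entry and output paths are invented:

```typescript
// Hypothetical calls — runBuild is the function defined in the script above.
declare function runBuild(
  entry: string,
  outfile: string,
  opts?: { skipMeta?: boolean; bundle?: boolean; silent?: boolean }
): void

runBuild("./src/index.ts", "dist/index.js") // defaults: bundle, write metafile, log

runBuild("./src/worker.ts", "dist/worker.js", {
  skipMeta: true, // don't write dist/worker.js.meta.json
  bundle: false, // hand the entry to esbuild without bundling dependencies
  silent: true, // suppress the success log line
})
```

Note that the defaults live on the parameter itself, so passing a partial options object replaces all three flags; passing `{ silent: true }` alone would leave `bundle` undefined.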


@@ -1,89 +0,0 @@
-echo "Linking backend-core"
-cd packages/backend-core
-yarn unlink
-yarn link
-cd -
-
-echo "Linking string-templates"
-cd packages/string-templates
-yarn unlink
-yarn link
-cd -
-
-echo "Linking types"
-cd packages/types
-yarn unlink
-yarn link
-cd -
-
-echo "Linking bbui"
-cd packages/bbui
-yarn unlink
-yarn link
-cd -
-
-echo "Linking frontend-core"
-cd packages/frontend-core
-yarn unlink
-yarn link
-cd -
-
-echo "Linking shared-core"
-cd packages/shared-core
-yarn unlink
-yarn link
-cd -
-
-if [ -d packages/pro/src ]; then
-  pro_loaded_locally=true
-else
-  pro_loaded_locally=false
-fi
-
-if [ $pro_loaded_locally = true ]; then
-  echo "Linking pro"
-  cd packages/pro
-  yarn unlink
-  yarn link
-  cd -
-fi
-
-if [ -d "../account-portal" ]; then
-  cd ../account-portal
-  echo "Bootstrapping account-portal"
-  yarn bootstrap
-
-  cd packages/server
-  echo "Linking backend-core to account-portal (server)"
-  yarn link "@budibase/backend-core"
-
-  echo "Linking string-templates to account-portal (server)"
-  yarn link "@budibase/string-templates"
-
-  echo "Linking types to account-portal (server)"
-  yarn link "@budibase/types"
-
-  echo "Linking shared-core to account-portal (server)"
-  yarn link "@budibase/shared-core"
-
-  if [ $pro_loaded_locally = true ]; then
-    echo "Linking pro to account-portal (server)"
-    yarn link "@budibase/pro"
-  fi
-
-  cd ../ui
-  echo "Linking bbui to account-portal (ui)"
-  yarn link "@budibase/bbui"
-
-  echo "Linking shared-core to account-portal (ui)"
-  yarn link "@budibase/shared-core"
-
-  echo "Linking string-templates to account-portal (ui)"
-  yarn link "@budibase/string-templates"
-
-  echo "Linking types to account-portal (ui)"
-  yarn link "@budibase/types"
-
-  echo "Linking frontend-core to account-portal (ui)"
-  yarn link "@budibase/frontend-core"
-fi


@@ -1,16 +0,0 @@
-#!/bin/bash
-
-# Define the packages
-PACKAGES=("@budibase/backend-core" "@budibase/worker" "@budibase/server" "@budibase/string-templates" "@budibase/types" "@budibase/shared-core")
-
-# Generate the scope arguments
-SCOPE_ARGS=""
-for PACKAGE in "${PACKAGES[@]}"; do
-  SCOPE_ARGS+="--scope $PACKAGE "
-done
-
-# Run the commands with the scope arguments
-for COMMAND in "$@"; do
-  echo "Running: $COMMAND $SCOPE_ARGS"
-  yarn $COMMAND $SCOPE_ARGS
-done