diff --git a/README.md b/README.md
index 7d11ea570f..aa368d29fd 100644
--- a/README.md
+++ b/README.md
@@ -104,12 +104,14 @@ Budibase is made to scale. With Budibase, you can self-host on your own infrastr
## 🏁 Get started
-
+
-Deploy Budibase self-Hosted in your existing infrastructure, using Docker, Kubernetes, and Digital Ocean.
+Deploy Budibase self-hosted in your existing infrastructure, using Docker, Kubernetes, and Digital Ocean.
Or use Budibase Cloud if you don't need to self-host, and would like to get started quickly.
-### [Get started with Budibase](https://budibase.com)
+### [Get started with self-hosting Budibase](https://docs.budibase.com/self-hosting/self-host)
+
+### [Get started with Budibase Cloud](https://budibase.com)
diff --git a/i18n/README.jp.md b/i18n/README.jp.md
new file mode 100644
index 0000000000..6fea497d53
--- /dev/null
+++ b/i18n/README.jp.md
@@ -0,0 +1,214 @@
+
+
+
+
+
+
+ Budibase
+
+
+
+  The low-code platform you'll enjoy using
+
+
+  Budibase is an open-source low-code platform that makes it easy to build tools that boost productivity.
+
+
+
+ 🤖 🎨 🚀
+
+
+## ✨ Features
+
+### Build "real" software
+Unlike other platforms, with Budibase alone you can build and ship complete single-page applications. Applications built with Budibase deliver excellent performance and come with responsive design out of the box, so they are sure to impress your users!
+
+
+### Open source and extensible
+Budibase is open source and licensed under GPL v3. That should give you confidence that Budibase will always be around. We also offer a developer-friendly environment, so you are free to fork the source code and modify it as much as you like, or contribute directly to Budibase.
+
+
+### Load data or start from scratch
+Budibase can work with existing data from a wide range of tools, including MongoDB, CouchDB, PostgreSQL, MySQL, Airtable, S3, DynamoDB, and REST APIs. Unlike other platforms, Budibase also lets you start building a business application from scratch, without any data at all. [Request a new data source](https://github.com/Budibase/budibase/discussions?discussions_q=category%3AIdeas).
+
+
+
+
+
+
+### Design and build applications with powerful built-in components
+
+Budibase ships with beautifully designed, powerful components that make building a UI quick and easy. A wide range of CSS styling options is also available, so you can get even more creative.
+ [Request a new component](https://github.com/Budibase/budibase/discussions?discussions_q=category%3AIdeas).
+
+
+
+
+
+
+### Automate processes, integrate with other tools, and connect to webhooks!
+Save time by automating repetitive tasks. From connecting to webhooks to sending automated emails, just leave it all to Budibase. You can easily [create a new automation](https://github.com/Budibase/automations) or [request one here](https://github.com/Budibase/budibase/discussions?discussions_q=category%3AIdeas).
+
+
+
+
+
+
+### Integrate with the tools you already use
+Budibase integrates with a number of popular tools, so you can build applications that fit your needs perfectly.
+
+
+
+
+
+
+### An admin's paradise
+Budibase is made to scale to projects of any size. With Budibase, you can self-host on your own or your organisation's servers and manage users, onboarding, SMTP, apps, groups, theming and more, all in one place. You can also provide users or groups with an app portal, and delegate user management to group admins.
+- Watch the promo video: https://youtu.be/xoljVpty_Kw
+
+
+
+## 🏁 Get started
+
+
+
+Self-host using Docker, Kubernetes, or Digital Ocean; or, if self-hosting is a hurdle or you want to get going right away, use Budibase Cloud and start in minutes.
+
+### [Self-host Budibase](https://docs.budibase.com/self-hosting/self-host)
+
+### [Use Budibase Cloud](https://budibase.com)
+
+
+
+
+## 🎓 Learning Budibase
+
+The Budibase documentation [lives here](https://docs.budibase.com).
+
+
+
+
+
+## 💬 Community
+
+If you run into any problems, or want to chat with other users in the Budibase community, come and join our [Github discussions](https://github.com/Budibase/budibase/discussions).
+
+
+
+
+## ❗ Code of conduct
+
+Budibase is dedicated to providing a welcoming, diverse, and harassment-free experience for everyone. We ask that everyone taking part in the Budibase community abide by our [**Code of Conduct**](https://github.com/Budibase/budibase/blob/HEAD/.github/CODE_OF_CONDUCT.md). Please be sure to read it.
+
+
+
+
+
+
+## 🙌 Contributing to Budibase
+
+
+From opening a bug report to creating a pull request: every contribution is appreciated and welcome. If you're planning to implement a new feature or change an API, please create an issue first. That way your valuable ideas reach us, and none of your work goes to waste.
+
+### Not sure where to start?
+This is a great place to begin contributing! [First time issues project](https://github.com/Budibase/budibase/projects/22).
+
+### How the repository is organized
+Budibase is a monorepo managed by lerna, which handles the building and publishing of the budibase packages. The packages that make up Budibase are:
+
+- [packages/builder](https://github.com/Budibase/budibase/tree/HEAD/packages/builder) - contains the code for the budibase builder, the client-side svelte application.
+
+- [packages/client](https://github.com/Budibase/budibase/tree/HEAD/packages/client) - a module that runs in the browser, reads JSON definitions, and creates "living" web applications from them.
+
+- [packages/server](https://github.com/Budibase/budibase/tree/HEAD/packages/server) - the budibase server. This Koa app serves the JS for the builder and budibase apps, and provides the API for interacting with the database and file system.
+
+For more details, see [CONTRIBUTING.md](https://github.com/Budibase/budibase/blob/HEAD/.github/CONTRIBUTING.md).
+
+
+
+
+## 📝 License
+
+Budibase is open source, licensed under [GPL v3](https://www.gnu.org/licenses/gpl-3.0.en.html). The client and component libraries are licensed under [MPL](https://directory.fsf.org/wiki/License:MPL-2.0), so the applications you build can be licensed however you like.
+
+
+
+## ⭐ Stargazers over time
+
+[![Stargazers over time](https://starchart.cc/Budibase/budibase.svg)](https://starchart.cc/Budibase/budibase)
+
+If you are experiencing issues between updates of the builder, please use [this guide](https://github.com/Budibase/budibase/blob/HEAD/.github/CONTRIBUTING.md#troubleshooting) to clear your environment.
+
+
+
+## Contributors ✨
+
+Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):
+
+
+
+
+
+
+
+
+
+
+
+This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind are welcome!
+
diff --git a/lerna.json b/lerna.json
index 9e1abe4e79..b0db0bf381 100644
--- a/lerna.json
+++ b/lerna.json
@@ -1,5 +1,5 @@
{
- "version": "1.0.49-alpha.5",
+ "version": "1.0.50-alpha.6",
"npmClient": "yarn",
"packages": [
"packages/*"
diff --git a/packages/backend-core/context.js b/packages/backend-core/context.js
new file mode 100644
index 0000000000..4bc100687d
--- /dev/null
+++ b/packages/backend-core/context.js
@@ -0,0 +1,17 @@
+const {
+ getAppDB,
+ getDevAppDB,
+ getProdAppDB,
+ getAppId,
+ updateAppId,
+ doInAppContext,
+} = require("./src/context")
+
+module.exports = {
+ getAppDB,
+ getDevAppDB,
+ getProdAppDB,
+ getAppId,
+ updateAppId,
+ doInAppContext,
+}
diff --git a/packages/backend-core/db.js b/packages/backend-core/db.js
index 47854ca9c7..d2adf6c092 100644
--- a/packages/backend-core/db.js
+++ b/packages/backend-core/db.js
@@ -1,5 +1,6 @@
module.exports = {
...require("./src/db/utils"),
...require("./src/db/constants"),
+ ...require("./src/db"),
...require("./src/db/views"),
}
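
The change to `db.js` above folds `./src/db` into the same spread-merged export object. A minimal sketch of how this aggregation pattern behaves (the module contents here are stand-ins, not Budibase's real exports): when two spread modules export the same key, the later spread wins.

```javascript
// Stand-in modules; in db.js these come from require("./src/db/utils") etc.
const utils = { newid: () => "abc123", shared: "from-utils" }
const dbModule = { getCouch: () => "couch", shared: "from-db" }

// Spread-merge aggregation: ordering matters, later spreads override.
const merged = {
  ...utils,
  ...dbModule,
}

console.log(merged.shared) // "from-db": the later spread wins
console.log(Object.keys(merged)) // [ 'newid', 'shared', 'getCouch' ]
```

So the ordering of spreads in `db.js` determines which module's export survives a name clash.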
diff --git a/packages/backend-core/deprovision.js b/packages/backend-core/deprovision.js
index b4b8dc6110..672da214ff 100644
--- a/packages/backend-core/deprovision.js
+++ b/packages/backend-core/deprovision.js
@@ -1 +1 @@
-module.exports = require("./src/tenancy/deprovision")
+module.exports = require("./src/context/deprovision")
diff --git a/packages/backend-core/package.json b/packages/backend-core/package.json
index c416c2c4e3..55ca8c39c5 100644
--- a/packages/backend-core/package.json
+++ b/packages/backend-core/package.json
@@ -1,6 +1,6 @@
{
"name": "@budibase/backend-core",
- "version": "1.0.49-alpha.5",
+ "version": "1.0.50-alpha.6",
"description": "Budibase backend core libraries used in server and worker",
"main": "src/index.js",
"author": "Budibase",
diff --git a/packages/backend-core/src/tenancy/FunctionContext.js b/packages/backend-core/src/context/FunctionContext.js
similarity index 70%
rename from packages/backend-core/src/tenancy/FunctionContext.js
rename to packages/backend-core/src/context/FunctionContext.js
index d97a3a30b4..1a3f65056e 100644
--- a/packages/backend-core/src/tenancy/FunctionContext.js
+++ b/packages/backend-core/src/context/FunctionContext.js
@@ -4,8 +4,8 @@ const { newid } = require("../hashing")
const REQUEST_ID_KEY = "requestId"
class FunctionContext {
- static getMiddleware(updateCtxFn = null) {
- const namespace = this.createNamespace()
+ static getMiddleware(updateCtxFn = null, contextName = "session") {
+ const namespace = this.createNamespace(contextName)
return async function (ctx, next) {
await new Promise(
@@ -24,14 +24,14 @@ class FunctionContext {
}
}
- static run(callback) {
- const namespace = this.createNamespace()
+ static run(callback, contextName = "session") {
+ const namespace = this.createNamespace(contextName)
return namespace.runAndReturn(callback)
}
- static setOnContext(key, value) {
- const namespace = this.createNamespace()
+ static setOnContext(key, value, contextName = "session") {
+ const namespace = this.createNamespace(contextName)
namespace.set(key, value)
}
@@ -55,16 +55,16 @@ class FunctionContext {
}
}
- static destroyNamespace() {
+ static destroyNamespace(name = "session") {
if (this._namespace) {
- cls.destroyNamespace("session")
+ cls.destroyNamespace(name)
this._namespace = null
}
}
- static createNamespace() {
+ static createNamespace(name = "session") {
if (!this._namespace) {
- this._namespace = cls.createNamespace("session")
+ this._namespace = cls.createNamespace(name)
}
return this._namespace
}
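
The edit above threads an optional `contextName` (defaulting to `"session"`) through the namespace helpers, so existing callers keep their behaviour while new callers can isolate state. A rough sketch of that default-parameter pattern, with a plain `Map` registry standing in for cls-hooked namespaces (the registry and helper names here are hypothetical):

```javascript
// Hypothetical stand-in for cls-hooked's namespace registry.
const namespaces = new Map()

function createNamespace(name = "session") {
  // lazily create the named namespace, mirroring FunctionContext
  if (!namespaces.has(name)) {
    namespaces.set(name, { name, store: new Map() })
  }
  return namespaces.get(name)
}

function destroyNamespace(name = "session") {
  namespaces.delete(name)
}

function setOnContext(key, value, contextName = "session") {
  createNamespace(contextName).store.set(key, value)
}

// Callers that pass no name keep the old "session" behaviour...
setOnContext("requestId", "abc123")
// ...while new callers can isolate state under their own namespace.
setOnContext("requestId", "xyz789", "automation")

console.log(createNamespace().store.get("requestId")) // "abc123"
console.log(createNamespace("automation").store.get("requestId")) // "xyz789"
```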
diff --git a/packages/backend-core/src/tenancy/deprovision.js b/packages/backend-core/src/context/deprovision.js
similarity index 98%
rename from packages/backend-core/src/tenancy/deprovision.js
rename to packages/backend-core/src/context/deprovision.js
index 608ca1b84a..1fbc2c8398 100644
--- a/packages/backend-core/src/tenancy/deprovision.js
+++ b/packages/backend-core/src/context/deprovision.js
@@ -1,6 +1,6 @@
const { getGlobalUserParams, getAllApps } = require("../db/utils")
const { getDB, getCouch } = require("../db")
-const { getGlobalDB } = require("./tenancy")
+const { getGlobalDB } = require("../tenancy")
const { StaticDatabases } = require("../db/constants")
const TENANT_DOC = StaticDatabases.PLATFORM_INFO.docs.tenants
diff --git a/packages/backend-core/src/context/index.js b/packages/backend-core/src/context/index.js
new file mode 100644
index 0000000000..1c1238278e
--- /dev/null
+++ b/packages/backend-core/src/context/index.js
@@ -0,0 +1,195 @@
+const env = require("../environment")
+const { Headers } = require("../../constants")
+const cls = require("./FunctionContext")
+const { getCouch } = require("../db")
+const { getProdAppID, getDevelopmentAppID } = require("../db/conversions")
+const { isEqual } = require("lodash")
+
+// some test cases call functions directly, need to
+// store an app ID to pretend there is a context
+let TEST_APP_ID = null
+
+const ContextKeys = {
+ TENANT_ID: "tenantId",
+ APP_ID: "appId",
+ // whatever the request app DB was
+ CURRENT_DB: "currentDb",
+ // get the prod app DB from the request
+ PROD_DB: "prodDb",
+ // get the dev app DB from the request
+ DEV_DB: "devDb",
+ DB_OPTS: "dbOpts",
+}
+
+exports.DEFAULT_TENANT_ID = "default"
+
+exports.isDefaultTenant = () => {
+ return exports.getTenantId() === exports.DEFAULT_TENANT_ID
+}
+
+exports.isMultiTenant = () => {
+ return env.MULTI_TENANCY
+}
+
+// used for automations, API endpoints should always be in context already
+exports.doInTenant = (tenantId, task) => {
+ return cls.run(() => {
+ // set the tenant id
+ cls.setOnContext(ContextKeys.TENANT_ID, tenantId)
+
+ // invoke the task
+ return task()
+ })
+}
+
+exports.doInAppContext = (appId, task) => {
+ return cls.run(() => {
+ // set the app ID
+ cls.setOnContext(ContextKeys.APP_ID, appId)
+
+ // invoke the task
+ return task()
+ })
+}
+
+exports.updateTenantId = tenantId => {
+ cls.setOnContext(ContextKeys.TENANT_ID, tenantId)
+}
+
+exports.updateAppId = appId => {
+ try {
+ cls.setOnContext(ContextKeys.APP_ID, appId)
+ cls.setOnContext(ContextKeys.PROD_DB, null)
+ cls.setOnContext(ContextKeys.DEV_DB, null)
+ cls.setOnContext(ContextKeys.CURRENT_DB, null)
+ cls.setOnContext(ContextKeys.DB_OPTS, null)
+ } catch (err) {
+ if (env.isTest()) {
+ TEST_APP_ID = appId
+ } else {
+ throw err
+ }
+ }
+}
+
+exports.setTenantId = (
+ ctx,
+ opts = { allowQs: false, allowNoTenant: false }
+) => {
+ let tenantId
+ // exit early if not multi-tenant
+ if (!exports.isMultiTenant()) {
+    cls.setOnContext(ContextKeys.TENANT_ID, exports.DEFAULT_TENANT_ID)
+ return
+ }
+
+ const allowQs = opts && opts.allowQs
+ const allowNoTenant = opts && opts.allowNoTenant
+ const header = ctx.request.headers[Headers.TENANT_ID]
+ const user = ctx.user || {}
+ if (allowQs) {
+ const query = ctx.request.query || {}
+ tenantId = query.tenantId
+ }
+  // the user's tenant ID or the header override the query string (if allowed);
+  // URL params cannot be used in a middleware, as they are
+  // processed later in the chain
+ tenantId = user.tenantId || header || tenantId
+
+ // Set the tenantId from the subdomain
+ if (!tenantId) {
+ tenantId = ctx.subdomains && ctx.subdomains[0]
+ }
+
+ if (!tenantId && !allowNoTenant) {
+ ctx.throw(403, "Tenant id not set")
+ }
+  // check tenant ID just in case no tenant was allowed
+ if (tenantId) {
+ cls.setOnContext(ContextKeys.TENANT_ID, tenantId)
+ }
+}
+
+exports.isTenantIdSet = () => {
+ const tenantId = cls.getFromContext(ContextKeys.TENANT_ID)
+ return !!tenantId
+}
+
+exports.getTenantId = () => {
+ if (!exports.isMultiTenant()) {
+ return exports.DEFAULT_TENANT_ID
+ }
+ const tenantId = cls.getFromContext(ContextKeys.TENANT_ID)
+ if (!tenantId) {
+ throw Error("Tenant id not found")
+ }
+ return tenantId
+}
+
+exports.getAppId = () => {
+ const foundId = cls.getFromContext(ContextKeys.APP_ID)
+ if (!foundId && env.isTest() && TEST_APP_ID) {
+ return TEST_APP_ID
+ } else {
+ return foundId
+ }
+}
+
+function getDB(key, opts) {
+ const dbOptsKey = `${key}${ContextKeys.DB_OPTS}`
+ let storedOpts = cls.getFromContext(dbOptsKey)
+ let db = cls.getFromContext(key)
+ if (db && isEqual(opts, storedOpts)) {
+ return db
+ }
+ const appId = exports.getAppId()
+ const CouchDB = getCouch()
+ let toUseAppId
+ switch (key) {
+ case ContextKeys.CURRENT_DB:
+ toUseAppId = appId
+ break
+ case ContextKeys.PROD_DB:
+ toUseAppId = getProdAppID(appId)
+ break
+ case ContextKeys.DEV_DB:
+ toUseAppId = getDevelopmentAppID(appId)
+ break
+ }
+ db = new CouchDB(toUseAppId, opts)
+ try {
+ cls.setOnContext(key, db)
+ if (opts) {
+ cls.setOnContext(dbOptsKey, opts)
+ }
+ } catch (err) {
+ if (!env.isTest()) {
+ throw err
+ }
+ }
+ return db
+}
+
+/**
+ * Opens the app database based on whatever the request
+ * contained, dev or prod.
+ */
+exports.getAppDB = opts => {
+ return getDB(ContextKeys.CURRENT_DB, opts)
+}
+
+/**
+ * Opens the prod app database: if the request contained a
+ * development app ID, this opens the prod equivalent instead.
+ */
+exports.getProdAppDB = opts => {
+ return getDB(ContextKeys.PROD_DB, opts)
+}
+
+/**
+ * Opens the dev app database: if the request contained a
+ * prod app ID, this opens the dev equivalent instead.
+ */
+exports.getDevAppDB = opts => {
+ return getDB(ContextKeys.DEV_DB, opts)
+}
diff --git a/packages/backend-core/src/db/constants.js b/packages/backend-core/src/db/constants.js
index 2affb09c7c..b41a9a9c08 100644
--- a/packages/backend-core/src/db/constants.js
+++ b/packages/backend-core/src/db/constants.js
@@ -32,3 +32,7 @@ exports.StaticDatabases = {
},
},
}
+
+exports.APP_PREFIX = exports.DocumentTypes.APP + exports.SEPARATOR
+exports.APP_DEV = exports.APP_DEV_PREFIX =
+ exports.DocumentTypes.APP_DEV + exports.SEPARATOR
diff --git a/packages/backend-core/src/db/conversions.js b/packages/backend-core/src/db/conversions.js
new file mode 100644
index 0000000000..50d896322f
--- /dev/null
+++ b/packages/backend-core/src/db/conversions.js
@@ -0,0 +1,46 @@
+const NO_APP_ERROR = "No app provided"
+const { APP_DEV_PREFIX, APP_PREFIX } = require("./constants")
+
+exports.isDevAppID = appId => {
+ if (!appId) {
+    throw new Error(NO_APP_ERROR)
+ }
+ return appId.startsWith(APP_DEV_PREFIX)
+}
+
+exports.isProdAppID = appId => {
+ if (!appId) {
+    throw new Error(NO_APP_ERROR)
+ }
+ return appId.startsWith(APP_PREFIX) && !exports.isDevAppID(appId)
+}
+
+exports.isDevApp = app => {
+ if (!app) {
+    throw new Error(NO_APP_ERROR)
+ }
+ return exports.isDevAppID(app.appId)
+}
+
+/**
+ * Convert a development app ID to a deployed app ID.
+ */
+exports.getProdAppID = appId => {
+ // if dev, convert it
+ if (appId.startsWith(APP_DEV_PREFIX)) {
+ const id = appId.split(APP_DEV_PREFIX)[1]
+ return `${APP_PREFIX}${id}`
+ }
+ return appId
+}
+
+/**
+ * Convert a deployed app ID to a development app ID.
+ */
+exports.getDevelopmentAppID = appId => {
+ if (!appId.startsWith(APP_DEV_PREFIX)) {
+ const id = appId.split(APP_PREFIX)[1]
+ return `${APP_DEV_PREFIX}${id}`
+ }
+ return appId
+}
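
The conversion helpers above just swap the development prefix for the production one and back. A standalone sketch (the prefix values `"app_"` and `"app_dev_"` are assumptions based on the `DocumentTypes` constants, and input validation is omitted for brevity):

```javascript
// Assumed prefix values; in backend-core these are derived from
// DocumentTypes.APP / DocumentTypes.APP_DEV plus the SEPARATOR.
const APP_PREFIX = "app_"
const APP_DEV_PREFIX = "app_dev_"

function getProdAppID(appId) {
  // if dev, strip the dev prefix and re-add the prod one
  if (appId.startsWith(APP_DEV_PREFIX)) {
    return `${APP_PREFIX}${appId.split(APP_DEV_PREFIX)[1]}`
  }
  return appId
}

function getDevelopmentAppID(appId) {
  // already a dev ID: return unchanged
  if (!appId.startsWith(APP_DEV_PREFIX)) {
    return `${APP_DEV_PREFIX}${appId.split(APP_PREFIX)[1]}`
  }
  return appId
}

console.log(getProdAppID("app_dev_96a315bc")) // "app_96a315bc"
console.log(getDevelopmentAppID("app_96a315bc")) // "app_dev_96a315bc"
```

Both functions are idempotent: feeding an ID that is already in the target form returns it unchanged.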
diff --git a/packages/backend-core/src/db/utils.js b/packages/backend-core/src/db/utils.js
index 2bc5462646..f5ea2f8486 100644
--- a/packages/backend-core/src/db/utils.js
+++ b/packages/backend-core/src/db/utils.js
@@ -2,7 +2,13 @@ const { newid } = require("../hashing")
const Replication = require("./Replication")
const { DEFAULT_TENANT_ID, Configs } = require("../constants")
const env = require("../environment")
-const { StaticDatabases, SEPARATOR, DocumentTypes } = require("./constants")
+const {
+ StaticDatabases,
+ SEPARATOR,
+ DocumentTypes,
+ APP_PREFIX,
+ APP_DEV,
+} = require("./constants")
const {
getTenantId,
getTenantIDFromAppID,
@@ -12,8 +18,13 @@ const fetch = require("node-fetch")
const { getCouch } = require("./index")
const { getAppMetadata } = require("../cache/appMetadata")
const { checkSlashesInUrl } = require("../helpers")
-
-const NO_APP_ERROR = "No app provided"
+const {
+ isDevApp,
+ isProdAppID,
+ isDevAppID,
+ getDevelopmentAppID,
+ getProdAppID,
+} = require("./conversions")
const UNICODE_MAX = "\ufff0"
@@ -24,10 +35,15 @@ exports.ViewNames = {
exports.StaticDatabases = StaticDatabases
exports.DocumentTypes = DocumentTypes
-exports.APP_PREFIX = DocumentTypes.APP + SEPARATOR
-exports.APP_DEV = exports.APP_DEV_PREFIX = DocumentTypes.APP_DEV + SEPARATOR
+exports.APP_PREFIX = APP_PREFIX
+exports.APP_DEV = exports.APP_DEV_PREFIX = APP_DEV
exports.SEPARATOR = SEPARATOR
exports.getTenantIDFromAppID = getTenantIDFromAppID
+exports.isDevApp = isDevApp
+exports.isProdAppID = isProdAppID
+exports.isDevAppID = isDevAppID
+exports.getDevelopmentAppID = getDevelopmentAppID
+exports.getProdAppID = getProdAppID
/**
* If creating DB allDocs/query params with only a single top level ID this can be used, this
@@ -52,27 +68,6 @@ function getDocParams(docType, docId = null, otherProps = {}) {
}
}
-exports.isDevAppID = appId => {
- if (!appId) {
- throw NO_APP_ERROR
- }
- return appId.startsWith(exports.APP_DEV_PREFIX)
-}
-
-exports.isProdAppID = appId => {
- if (!appId) {
- throw NO_APP_ERROR
- }
- return appId.startsWith(exports.APP_PREFIX) && !exports.isDevAppID(appId)
-}
-
-function isDevApp(app) {
- if (!app) {
- throw NO_APP_ERROR
- }
- return exports.isDevAppID(app.appId)
-}
-
/**
* Generates a new workspace ID.
* @returns {string} The new workspace ID which the workspace doc can be stored under.
@@ -157,29 +152,6 @@ exports.getRoleParams = (roleId = null, otherProps = {}) => {
return getDocParams(DocumentTypes.ROLE, roleId, otherProps)
}
-/**
- * Convert a development app ID to a deployed app ID.
- */
-exports.getDeployedAppID = appId => {
- // if dev, convert it
- if (appId.startsWith(exports.APP_DEV_PREFIX)) {
- const id = appId.split(exports.APP_DEV_PREFIX)[1]
- return `${exports.APP_PREFIX}${id}`
- }
- return appId
-}
-
-/**
- * Convert a deployed app ID to a development app ID.
- */
-exports.getDevelopmentAppID = appId => {
- if (!appId.startsWith(exports.APP_DEV_PREFIX)) {
- const id = appId.split(exports.APP_PREFIX)[1]
- return `${exports.APP_DEV_PREFIX}${id}`
- }
- return appId
-}
-
exports.getCouchUrl = () => {
if (!env.COUCH_DB_URL) return
@@ -225,7 +197,7 @@ exports.getAllDbs = async () => {
}
let couchUrl = `${exports.getCouchUrl()}/_all_dbs`
let tenantId = getTenantId()
- if (!env.MULTI_TENANCY || tenantId == DEFAULT_TENANT_ID) {
+ if (!env.MULTI_TENANCY || tenantId === DEFAULT_TENANT_ID) {
// just get all DBs when:
// - single tenancy
// - default tenant
@@ -250,11 +222,11 @@ exports.getAllDbs = async () => {
/**
* Lots of different points in the system need to find the full list of apps, this will
* enumerate the entire CouchDB cluster and get the list of databases (every app).
- * NOTE: this operation is fine in self hosting, but cannot be used when hosting many
- * different users/companies apps as there is no security around it - all apps are returned.
+ *
* @return {Promise} returns the app information document stored in each app database.
*/
-exports.getAllApps = async (CouchDB, { dev, all, idsOnly } = {}) => {
+exports.getAllApps = async ({ dev, all, idsOnly } = {}) => {
+ const CouchDB = getCouch()
let tenantId = getTenantId()
if (!env.MULTI_TENANCY && !tenantId) {
tenantId = DEFAULT_TENANT_ID
@@ -310,8 +282,8 @@ exports.getAllApps = async (CouchDB, { dev, all, idsOnly } = {}) => {
/**
* Utility function for getAllApps but filters to production apps only.
*/
-exports.getDeployedAppIDs = async CouchDB => {
- return (await exports.getAllApps(CouchDB, { idsOnly: true })).filter(
+exports.getProdAppIDs = async () => {
+ return (await exports.getAllApps({ idsOnly: true })).filter(
id => !exports.isDevAppID(id)
)
}
@@ -319,13 +291,14 @@ exports.getDeployedAppIDs = async CouchDB => {
/**
* Utility function for the inverse of above.
*/
-exports.getDevAppIDs = async CouchDB => {
- return (await exports.getAllApps(CouchDB, { idsOnly: true })).filter(id =>
+exports.getDevAppIDs = async () => {
+ return (await exports.getAllApps({ idsOnly: true })).filter(id =>
exports.isDevAppID(id)
)
}
-exports.dbExists = async (CouchDB, dbName) => {
+exports.dbExists = async dbName => {
+ const CouchDB = getCouch()
let exists = false
try {
const db = CouchDB(dbName, { skip_setup: true })
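
Several helpers in this hunk (`getAllApps`, `dbExists`, `getProdAppIDs`, `getDevAppIDs`) drop their explicit `CouchDB` parameter in favour of a constructor fetched internally via `getCouch()`. A toy sketch of that register-once pattern — the fake "constructor" object and its `databases` field are invented purely for illustration:

```javascript
// Register-once pattern: callers no longer thread CouchDB through
// every call; it is registered once and looked up internally.
let Couch = null

function setCouch(couchConstructor) {
  Couch = couchConstructor
}

function getCouch() {
  if (!Couch) throw new Error("CouchDB constructor has not been registered")
  return Couch
}

// before this diff: dbExists(CouchDB, dbName) — after: dbExists(dbName)
function dbExists(dbName) {
  const CouchDB = getCouch()
  return CouchDB.databases.includes(dbName)
}

// a fake stand-in for the real PouchDB/CouchDB constructor
setCouch({ databases: ["app_dev_123", "global-db"] })
console.log(dbExists("global-db")) // true
console.log(dbExists("missing")) // false
```

The trade-off is an implicit dependency on initialization order: `setCouch` must run before any helper that calls `getCouch()`.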
diff --git a/packages/backend-core/src/middleware/appTenancy.js b/packages/backend-core/src/middleware/appTenancy.js
index 30fc4f7453..b0430a0051 100644
--- a/packages/backend-core/src/middleware/appTenancy.js
+++ b/packages/backend-core/src/middleware/appTenancy.js
@@ -3,8 +3,9 @@ const {
updateTenantId,
isTenantIdSet,
DEFAULT_TENANT_ID,
+ updateAppId,
} = require("../tenancy")
-const ContextFactory = require("../tenancy/FunctionContext")
+const ContextFactory = require("../context/FunctionContext")
const { getTenantIDFromAppID } = require("../db/utils")
module.exports = () => {
@@ -21,5 +22,6 @@ module.exports = () => {
const appId = ctx.appId ? ctx.appId : ctx.user ? ctx.user.appId : null
const tenantId = getTenantIDFromAppID(appId) || DEFAULT_TENANT_ID
updateTenantId(tenantId)
+ updateAppId(appId)
})
}
diff --git a/packages/backend-core/src/middleware/passport/local.js b/packages/backend-core/src/middleware/passport/local.js
index f95c3a173e..2149bd3e18 100644
--- a/packages/backend-core/src/middleware/passport/local.js
+++ b/packages/backend-core/src/middleware/passport/local.js
@@ -8,7 +8,7 @@ const { newid } = require("../../hashing")
const { createASession } = require("../../security/sessions")
const { getTenantId } = require("../../tenancy")
-const INVALID_ERR = "Invalid Credentials"
+const INVALID_ERR = "Invalid credentials"
const SSO_NO_PASSWORD = "SSO user does not have a password set"
const EXPIRED = "This account has expired. Please reset your password"
diff --git a/packages/backend-core/src/middleware/tenancy.js b/packages/backend-core/src/middleware/tenancy.js
index adfd36a503..5bb81f8824 100644
--- a/packages/backend-core/src/middleware/tenancy.js
+++ b/packages/backend-core/src/middleware/tenancy.js
@@ -1,5 +1,5 @@
const { setTenantId } = require("../tenancy")
-const ContextFactory = require("../tenancy/FunctionContext")
+const ContextFactory = require("../context/FunctionContext")
const { buildMatcherRegex, matches } = require("./matchers")
module.exports = (
diff --git a/packages/backend-core/src/security/roles.js b/packages/backend-core/src/security/roles.js
index 8529dde6f4..11abc70bdd 100644
--- a/packages/backend-core/src/security/roles.js
+++ b/packages/backend-core/src/security/roles.js
@@ -1,4 +1,3 @@
-const { getDB } = require("../db")
const { cloneDeep } = require("lodash/fp")
const { BUILTIN_PERMISSION_IDS } = require("./permissions")
const {
@@ -7,6 +6,8 @@ const {
DocumentTypes,
SEPARATOR,
} = require("../db/utils")
+const { getAppDB } = require("../context")
+const { getDB } = require("../db")
const BUILTIN_IDS = {
ADMIN: "ADMIN",
@@ -111,11 +112,10 @@ exports.lowerBuiltinRoleID = (roleId1, roleId2) => {
/**
* Gets the role object, this is mainly useful for two purposes, to check if the level exists and
* to check if the role inherits any others.
- * @param {string} appId The app in which to look for the role.
* @param {string|null} roleId The level ID to lookup.
* @returns {Promise} The role object, which may contain an "inherits" property.
*/
-exports.getRole = async (appId, roleId) => {
+exports.getRole = async roleId => {
if (!roleId) {
return null
}
@@ -128,7 +128,7 @@ exports.getRole = async (appId, roleId) => {
)
}
try {
- const db = getDB(appId)
+ const db = getAppDB()
const dbRole = await db.get(exports.getDBRoleID(roleId))
role = Object.assign(role, dbRole)
// finalise the ID
@@ -145,11 +145,12 @@ exports.getRole = async (appId, roleId) => {
/**
* Simple function to get all the roles based on the top level user role ID.
*/
-async function getAllUserRoles(appId, userRoleId) {
- if (!userRoleId) {
- return [BUILTIN_IDS.BASIC]
+async function getAllUserRoles(userRoleId) {
+ // admins have access to all roles
+ if (userRoleId === BUILTIN_IDS.ADMIN) {
+ return exports.getAllRoles()
}
- let currentRole = await exports.getRole(appId, userRoleId)
+ let currentRole = await exports.getRole(userRoleId)
let roles = currentRole ? [currentRole] : []
let roleIds = [userRoleId]
// get all the inherited roles
@@ -159,7 +160,7 @@ async function getAllUserRoles(appId, userRoleId) {
roleIds.indexOf(currentRole.inherits) === -1
) {
roleIds.push(currentRole.inherits)
- currentRole = await exports.getRole(appId, currentRole.inherits)
+ currentRole = await exports.getRole(currentRole.inherits)
roles.push(currentRole)
}
return roles
@@ -168,29 +169,23 @@ async function getAllUserRoles(appId, userRoleId) {
/**
* Returns an ordered array of the user's inherited role IDs, this can be used
* to determine if a user can access something that requires a specific role.
- * @param {string} appId The ID of the application from which roles should be obtained.
* @param {string} userRoleId The user's role ID, this can be found in their access token.
* @param {object} opts Various options, such as whether to only retrieve the IDs (default true).
* @returns {Promise} returns an ordered array of the roles, with the first being their
* highest level of access and the last being the lowest level.
*/
-exports.getUserRoleHierarchy = async (
- appId,
- userRoleId,
- opts = { idOnly: true }
-) => {
+exports.getUserRoleHierarchy = async (userRoleId, opts = { idOnly: true }) => {
// special case, if they don't have a role then they are a public user
- const roles = await getAllUserRoles(appId, userRoleId)
+ const roles = await getAllUserRoles(userRoleId)
return opts.idOnly ? roles.map(role => role._id) : roles
}
/**
* Given an app ID this will retrieve all of the roles that are currently within that app.
- * @param {string} appId The ID of the app to retrieve the roles from.
* @return {Promise} An array of the role objects that were found.
*/
exports.getAllRoles = async appId => {
- const db = getDB(appId)
+ const db = appId ? getDB(appId) : getAppDB()
const body = await db.allDocs(
getRoleParams(null, {
include_docs: true,
@@ -218,19 +213,17 @@ exports.getAllRoles = async appId => {
}
/**
- * This retrieves the required role/
- * @param appId
+ * This retrieves the required role
* @param permLevel
* @param resourceId
* @param subResourceId
* @return {Promise<{permissions}|Object>}
*/
exports.getRequiredResourceRole = async (
- appId,
permLevel,
{ resourceId, subResourceId }
) => {
- const roles = await exports.getAllRoles(appId)
+ const roles = await exports.getAllRoles()
let main = [],
sub = []
for (let role of roles) {
@@ -251,8 +244,7 @@ exports.getRequiredResourceRole = async (
}
class AccessController {
- constructor(appId) {
- this.appId = appId
+ constructor() {
this.userHierarchies = {}
}
@@ -270,7 +262,7 @@ class AccessController {
}
let roleIds = this.userHierarchies[userRoleId]
if (!roleIds) {
- roleIds = await exports.getUserRoleHierarchy(this.appId, userRoleId)
+ roleIds = await exports.getUserRoleHierarchy(userRoleId)
this.userHierarchies[userRoleId] = roleIds
}
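
The `getAllUserRoles` change above still relies on the same inheritance walk: follow `inherits` links from the user's role, guarding against cycles. A sketch of that walk with an in-memory role table standing in for the app database (the role IDs below mirror the built-ins; the new ADMIN short-circuit from this diff is omitted):

```javascript
// In-memory stand-in for the roles stored in the app database.
const roles = {
  ADMIN: { _id: "ADMIN", inherits: "POWER" },
  POWER: { _id: "POWER", inherits: "BASIC" },
  BASIC: { _id: "BASIC", inherits: "PUBLIC" },
  PUBLIC: { _id: "PUBLIC" },
}

function getUserRoleHierarchy(userRoleId) {
  let current = roles[userRoleId]
  const found = current ? [current] : []
  const seenIds = [userRoleId]
  // walk the inherits chain; seenIds guards against cycles
  while (current && current.inherits && !seenIds.includes(current.inherits)) {
    seenIds.push(current.inherits)
    current = roles[current.inherits]
    found.push(current)
  }
  return found.map(role => role._id)
}

console.log(getUserRoleHierarchy("POWER")) // [ 'POWER', 'BASIC', 'PUBLIC' ]
```

The returned array is ordered from the user's highest level of access down to the lowest, which is what `AccessController` caches per role ID.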
diff --git a/packages/backend-core/src/tenancy/context.js b/packages/backend-core/src/tenancy/context.js
deleted file mode 100644
index 01d1fdc604..0000000000
--- a/packages/backend-core/src/tenancy/context.js
+++ /dev/null
@@ -1,84 +0,0 @@
-const env = require("../environment")
-const { Headers } = require("../../constants")
-const cls = require("./FunctionContext")
-
-exports.DEFAULT_TENANT_ID = "default"
-
-exports.isDefaultTenant = () => {
- return exports.getTenantId() === exports.DEFAULT_TENANT_ID
-}
-
-exports.isMultiTenant = () => {
- return env.MULTI_TENANCY
-}
-
-const TENANT_ID = "tenantId"
-
-// used for automations, API endpoints should always be in context already
-exports.doInTenant = (tenantId, task) => {
- return cls.run(() => {
- // set the tenant id
- cls.setOnContext(TENANT_ID, tenantId)
-
- // invoke the task
- return task()
- })
-}
-
-exports.updateTenantId = tenantId => {
- cls.setOnContext(TENANT_ID, tenantId)
-}
-
-exports.setTenantId = (
- ctx,
- opts = { allowQs: false, allowNoTenant: false }
-) => {
- let tenantId
- // exit early if not multi-tenant
- if (!exports.isMultiTenant()) {
- cls.setOnContext(TENANT_ID, this.DEFAULT_TENANT_ID)
- return
- }
-
- const allowQs = opts && opts.allowQs
- const allowNoTenant = opts && opts.allowNoTenant
- const header = ctx.request.headers[Headers.TENANT_ID]
- const user = ctx.user || {}
- if (allowQs) {
- const query = ctx.request.query || {}
- tenantId = query.tenantId
- }
- // override query string (if allowed) by user, or header
- // URL params cannot be used in a middleware, as they are
- // processed later in the chain
- tenantId = user.tenantId || header || tenantId
-
- // Set the tenantId from the subdomain
- if (!tenantId) {
- tenantId = ctx.subdomains && ctx.subdomains[0]
- }
-
- if (!tenantId && !allowNoTenant) {
- ctx.throw(403, "Tenant id not set")
- }
- // check tenant ID just incase no tenant was allowed
- if (tenantId) {
- cls.setOnContext(TENANT_ID, tenantId)
- }
-}
-
-exports.isTenantIdSet = () => {
- const tenantId = cls.getFromContext(TENANT_ID)
- return !!tenantId
-}
-
-exports.getTenantId = () => {
- if (!exports.isMultiTenant()) {
- return exports.DEFAULT_TENANT_ID
- }
- const tenantId = cls.getFromContext(TENANT_ID)
- if (!tenantId) {
- throw Error("Tenant id not found")
- }
- return tenantId
-}
diff --git a/packages/backend-core/src/tenancy/index.js b/packages/backend-core/src/tenancy/index.js
index 2fe257d885..c847033a12 100644
--- a/packages/backend-core/src/tenancy/index.js
+++ b/packages/backend-core/src/tenancy/index.js
@@ -1,4 +1,4 @@
module.exports = {
- ...require("./context"),
+ ...require("../context"),
...require("./tenancy"),
}
diff --git a/packages/backend-core/src/tenancy/tenancy.js b/packages/backend-core/src/tenancy/tenancy.js
index de597eac01..8360198b60 100644
--- a/packages/backend-core/src/tenancy/tenancy.js
+++ b/packages/backend-core/src/tenancy/tenancy.js
@@ -1,6 +1,6 @@
const { getDB } = require("../db")
const { SEPARATOR, StaticDatabases, DocumentTypes } = require("../db/constants")
-const { getTenantId, DEFAULT_TENANT_ID, isMultiTenant } = require("./context")
+const { getTenantId, DEFAULT_TENANT_ID, isMultiTenant } = require("../context")
const env = require("../environment")
const TENANT_DOC = StaticDatabases.PLATFORM_INFO.docs.tenants
diff --git a/packages/backend-core/src/utils.js b/packages/backend-core/src/utils.js
index 6c71c51b9d..45fb4acd55 100644
--- a/packages/backend-core/src/utils.js
+++ b/packages/backend-core/src/utils.js
@@ -256,7 +256,7 @@ exports.saveUser = async (
exports.platformLogout = async ({ ctx, userId, keepActiveSession }) => {
if (!ctx) throw new Error("Koa context must be supplied to logout.")
- const currentSession = this.getCookie(ctx, Cookies.Auth)
+ const currentSession = exports.getCookie(ctx, Cookies.Auth)
let sessions = await getUserSessions(userId)
if (keepActiveSession) {
@@ -265,8 +265,8 @@ exports.platformLogout = async ({ ctx, userId, keepActiveSession }) => {
)
} else {
// clear cookies
- this.clearCookie(ctx, Cookies.Auth)
- this.clearCookie(ctx, Cookies.CurrentApp)
+ exports.clearCookie(ctx, Cookies.Auth)
+ exports.clearCookie(ctx, Cookies.CurrentApp)
}
await invalidateSessions(
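The `this.getCookie` → `exports.getCookie` change fixes a receiver bug: `this` inside an exported function depends on how the function is invoked, so it can fail to resolve sibling exports at call time. A minimal illustration (a hypothetical object, not the Budibase module):

```javascript
const helpers = {}

helpers.getCookie = name => `cookie:${name}`

// Fragile: relies on the call-site receiver being `helpers`
helpers.logoutWithThis = function () {
  return typeof this.getCookie === "function"
}

// Robust: references the known object directly, like `exports.getCookie`
helpers.logoutWithRef = function () {
  return typeof helpers.getCookie === "function"
}
```

Calling `helpers.logoutWithThis()` works, but passing the function around (as middleware frameworks often do) detaches `this` and breaks the lookup; the `exports.`-qualified form is immune to that.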
diff --git a/packages/bbui/package.json b/packages/bbui/package.json
index 0f3ba64508..9fb080e8c7 100644
--- a/packages/bbui/package.json
+++ b/packages/bbui/package.json
@@ -1,7 +1,7 @@
{
"name": "@budibase/bbui",
"description": "A UI solution used in the different Budibase projects.",
- "version": "1.0.49-alpha.5",
+ "version": "1.0.50-alpha.6",
"license": "MPL-2.0",
"svelte": "src/index.js",
"module": "dist/bbui.es.js",
@@ -79,6 +79,7 @@
"@spectrum-css/underlay": "^2.0.9",
"@spectrum-css/vars": "^3.0.1",
"dayjs": "^1.10.4",
+ "easymde": "^2.16.1",
"svelte-flatpickr": "^3.2.3",
"svelte-portal": "^1.0.0"
},
diff --git a/packages/bbui/src/Button/Button.svelte b/packages/bbui/src/Button/Button.svelte
index da4d405f02..67930b8030 100644
--- a/packages/bbui/src/Button/Button.svelte
+++ b/packages/bbui/src/Button/Button.svelte
@@ -1,5 +1,6 @@
-
- {#if icon}
-
-
-
+
+
(showTooltip = true)}
+ on:mouseleave={() => (showTooltip = false)}
+ >
+ {#if icon}
+
+
+
+ {/if}
+ {#if $$slots}
+
+ {/if}
+ {#if !disabled && tooltip}
+
+
+
+
+
+ {/if}
+
+ {#if showTooltip && tooltip}
+
{/if}
- {#if $$slots}
-
- {/if}
-
+
diff --git a/packages/bbui/src/ColorPicker/ColorPicker.svelte b/packages/bbui/src/ColorPicker/ColorPicker.svelte
index ff6a292d1b..1fa950fadc 100644
--- a/packages/bbui/src/ColorPicker/ColorPicker.svelte
+++ b/packages/bbui/src/ColorPicker/ColorPicker.svelte
@@ -5,7 +5,7 @@
import { fly } from "svelte/transition"
import Icon from "../Icon/Icon.svelte"
import Input from "../Form/Input.svelte"
- import { capitalise } from "../utils/helpers"
+ import { capitalise } from "../helpers"
export let value
export let size = "M"
diff --git a/packages/bbui/src/Form/Core/DatePicker.svelte b/packages/bbui/src/Form/Core/DatePicker.svelte
index 8edb68a38e..c1c4cc866f 100644
--- a/packages/bbui/src/Form/Core/DatePicker.svelte
+++ b/packages/bbui/src/Form/Core/DatePicker.svelte
@@ -5,7 +5,7 @@
import "@spectrum-css/textfield/dist/index-vars.css"
import "@spectrum-css/picker/dist/index-vars.css"
import { createEventDispatcher } from "svelte"
- import { generateID } from "../../utils/helpers"
+ import { uuid } from "../../helpers"
export let id = null
export let disabled = false
@@ -14,16 +14,20 @@
export let value = null
export let placeholder = null
export let appendTo = undefined
+ export let timeOnly = false
const dispatch = createEventDispatcher()
- const flatpickrId = `${generateID()}-wrapper`
+ const flatpickrId = `${uuid()}-wrapper`
let open = false
- let flatpickr
+ let flatpickr, flatpickrOptions, isTimeOnly
+
+ $: isTimeOnly = !timeOnly && value ? !isNaN(new Date(`0-${value}`)) : timeOnly
$: flatpickrOptions = {
element: `#${flatpickrId}`,
- enableTime: enableTime || false,
+ enableTime: isTimeOnly || enableTime || false,
+ noCalendar: isTimeOnly || false,
altInput: true,
- altFormat: enableTime ? "F j Y, H:i" : "F j, Y",
+ altFormat: isTimeOnly ? "H:i" : enableTime ? "F j Y, H:i" : "F j, Y",
wrap: true,
appendTo,
disableMobile: "true",
@@ -35,6 +39,11 @@
if (newValue) {
newValue = newValue.toISOString()
}
+ // if time only, set the date component to today
+ if (timeOnly) {
+ const todayDate = new Date().toISOString().split("T")[0]
+ newValue = `${todayDate}T${newValue.split("T")[1]}`
+ }
dispatch("change", newValue)
}
@@ -67,7 +76,11 @@
return null
}
let date
- if (val instanceof Date) {
+ let time = new Date(`0-${val}`)
+ // the value is a time-only string like 00:00:00
+ if (timeOnly || (typeof val === "string" && !isNaN(time))) {
+ date = time
+ } else if (val instanceof Date) {
// Use real date obj if already parsed
date = val
} else if (isNaN(val)) {
@@ -77,7 +90,7 @@
// Treat as numerical timestamp
date = new Date(parseInt(val))
}
- const time = date.getTime()
+ time = date.getTime()
if (isNaN(time)) {
return null
}
@@ -88,69 +101,71 @@
}
-
-
-
+
+{/key}
{#if open}
{/if}
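The `timeOnly` handling above pins time-only values to today's date before dispatching the change event. A standalone sketch of that normalization (the helper name is illustrative):

```javascript
// Keep the time portion of an ISO string, but replace the date component
// with today's date -- matching the DatePicker change handler above
const normalizeTimeOnly = isoValue => {
  const todayDate = new Date().toISOString().split("T")[0]
  return `${todayDate}T${isoValue.split("T")[1]}`
}
```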
diff --git a/packages/bbui/src/Form/Core/Dropzone.svelte b/packages/bbui/src/Form/Core/Dropzone.svelte
index 6b8022a36c..d739e751c9 100644
--- a/packages/bbui/src/Form/Core/Dropzone.svelte
+++ b/packages/bbui/src/Form/Core/Dropzone.svelte
@@ -3,7 +3,7 @@
import "@spectrum-css/typography/dist/index-vars.css"
import "@spectrum-css/illustratedmessage/dist/index-vars.css"
import { createEventDispatcher } from "svelte"
- import { generateID } from "../../utils/helpers"
+ import { uuid } from "../../helpers"
import Icon from "../../Icon/Icon.svelte"
import Link from "../../Link/Link.svelte"
import Tag from "../../Tags/Tag.svelte"
@@ -37,7 +37,7 @@
"jfif",
]
- const fieldId = id || generateID()
+ const fieldId = id || uuid()
let selectedImageIdx = 0
let fileDragged = false
let selectedUrl
diff --git a/packages/bbui/src/Form/Core/RichTextField.svelte b/packages/bbui/src/Form/Core/RichTextField.svelte
new file mode 100644
index 0000000000..f964405f0d
--- /dev/null
+++ b/packages/bbui/src/Form/Core/RichTextField.svelte
@@ -0,0 +1,42 @@
+
+
+
+
+
+
+
diff --git a/packages/bbui/src/Form/Core/TextArea.svelte b/packages/bbui/src/Form/Core/TextArea.svelte
index a022a98e5f..465212cd44 100644
--- a/packages/bbui/src/Form/Core/TextArea.svelte
+++ b/packages/bbui/src/Form/Core/TextArea.svelte
@@ -13,6 +13,7 @@
start: textarea.selectionStart,
end: textarea.selectionEnd,
})
+ export let align = null
let focus = false
let textarea
@@ -21,11 +22,23 @@
dispatch("change", event.target.value)
focus = false
}
+
+ const getStyleString = (attribute, value) => {
+ if (!attribute || value == null) {
+ return ""
+ }
+ if (isNaN(value)) {
+ return `${attribute}:${value};`
+ }
+ return `${attribute}:${value}px;`
+ }
+
+ $: heightString = getStyleString("height", height)
+ $: minHeightString = getStyleString("min-height", minHeight)
(focus = true)}
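The `getStyleString` helper added to TextArea treats bare numbers as pixel values and passes CSS strings through unchanged; reproduced here standalone so its behavior is easy to check:

```javascript
const getStyleString = (attribute, value) => {
  if (!attribute || value == null) {
    return ""
  }
  // Non-numeric values ("50vh", "auto") are assumed to be valid CSS
  if (isNaN(value)) {
    return `${attribute}:${value};`
  }
  // Numbers are treated as pixel lengths
  return `${attribute}:${value}px;`
}
```

Note that a numeric string like `"120"` also takes the pixel branch, since `isNaN("120")` is false.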
diff --git a/packages/bbui/src/Form/Core/TextField.svelte b/packages/bbui/src/Form/Core/TextField.svelte
index d2064ddde0..78b698eed2 100644
--- a/packages/bbui/src/Form/Core/TextField.svelte
+++ b/packages/bbui/src/Form/Core/TextField.svelte
@@ -12,6 +12,7 @@
export let updateOnChange = true
export let quiet = false
export let dataCy
+ export let align
const dispatch = createEventDispatcher()
let focus = false
@@ -92,8 +93,9 @@
on:input={onInput}
on:keyup={updateValueOnEnter}
{type}
- inputmode={type === "number" ? "decimal" : "text"}
class="spectrum-Textfield-input"
+ style={align ? `text-align: ${align};` : ""}
+ inputmode={type === "number" ? "decimal" : "text"}
/>
diff --git a/packages/bbui/src/Form/Core/index.js b/packages/bbui/src/Form/Core/index.js
index 440c4a1b15..3c3f9acb4d 100644
--- a/packages/bbui/src/Form/Core/index.js
+++ b/packages/bbui/src/Form/Core/index.js
@@ -10,3 +10,4 @@ export { default as CoreSearch } from "./Search.svelte"
export { default as CoreDatePicker } from "./DatePicker.svelte"
export { default as CoreDropzone } from "./Dropzone.svelte"
export { default as CoreStepper } from "./Stepper.svelte"
+export { default as CoreRichTextField } from "./RichTextField.svelte"
diff --git a/packages/bbui/src/Form/DatePicker.svelte b/packages/bbui/src/Form/DatePicker.svelte
index 7d5656a22d..9298c49177 100644
--- a/packages/bbui/src/Form/DatePicker.svelte
+++ b/packages/bbui/src/Form/DatePicker.svelte
@@ -9,6 +9,7 @@
export let disabled = false
export let error = null
export let enableTime = true
+ export let timeOnly = false
export let placeholder = null
export let appendTo = undefined
@@ -27,6 +28,7 @@
{value}
{placeholder}
{enableTime}
+ {timeOnly}
{appendTo}
on:change={onChange}
/>
diff --git a/packages/bbui/src/Form/RichTextField.svelte b/packages/bbui/src/Form/RichTextField.svelte
new file mode 100644
index 0000000000..275242df49
--- /dev/null
+++ b/packages/bbui/src/Form/RichTextField.svelte
@@ -0,0 +1,36 @@
+
+
+
+
+
diff --git a/packages/bbui/src/Markdown/MarkdownEditor.svelte b/packages/bbui/src/Markdown/MarkdownEditor.svelte
new file mode 100644
index 0000000000..7fb6414ad8
--- /dev/null
+++ b/packages/bbui/src/Markdown/MarkdownEditor.svelte
@@ -0,0 +1,60 @@
+
+
+{#key height}
+
+{/key}
diff --git a/packages/bbui/src/Markdown/MarkdownViewer.svelte b/packages/bbui/src/Markdown/MarkdownViewer.svelte
new file mode 100644
index 0000000000..5705020f45
--- /dev/null
+++ b/packages/bbui/src/Markdown/MarkdownViewer.svelte
@@ -0,0 +1,70 @@
+
+
+
+
+
+
+
diff --git a/packages/bbui/src/Markdown/SpectrumMDE.svelte b/packages/bbui/src/Markdown/SpectrumMDE.svelte
new file mode 100644
index 0000000000..9b0832c91f
--- /dev/null
+++ b/packages/bbui/src/Markdown/SpectrumMDE.svelte
@@ -0,0 +1,184 @@
+
+
+
+
+
+
+
diff --git a/packages/bbui/src/Notification/Notification.svelte b/packages/bbui/src/Notification/Notification.svelte
index cebc859bda..1d21131553 100644
--- a/packages/bbui/src/Notification/Notification.svelte
+++ b/packages/bbui/src/Notification/Notification.svelte
@@ -1,7 +1,12 @@
@@ -17,4 +22,28 @@
+ {#if dismissable}
+
+ {/if}
+
+
diff --git a/packages/bbui/src/Notification/NotificationDisplay.svelte b/packages/bbui/src/Notification/NotificationDisplay.svelte
index 9d96bf7e70..eb778f3aa0 100644
--- a/packages/bbui/src/Notification/NotificationDisplay.svelte
+++ b/packages/bbui/src/Notification/NotificationDisplay.svelte
@@ -1,7 +1,6 @@
-{dayjs(value).format("MMMM D YYYY, HH:mm")}
+
+ {dayjs(isTime ? time : value).format(
+ isTime ? "HH:mm:ss" : "MMMM D YYYY, HH:mm"
+ )}
+
diff --git a/packages/builder/src/components/design/PropertiesPanel/PropertyControls/ConditionalUIDrawer.svelte b/packages/builder/src/components/design/PropertiesPanel/PropertyControls/ConditionalUIDrawer.svelte
index e303729d0b..11d19edf7c 100644
--- a/packages/builder/src/components/design/PropertiesPanel/PropertyControls/ConditionalUIDrawer.svelte
+++ b/packages/builder/src/components/design/PropertiesPanel/PropertyControls/ConditionalUIDrawer.svelte
@@ -12,7 +12,7 @@
import { dndzone } from "svelte-dnd-action"
import { generate } from "shortid"
import DrawerBindableInput from "components/common/bindings/DrawerBindableInput.svelte"
- import { OperatorOptions, getValidOperatorsForType } from "constants/lucene"
+ import { LuceneUtils, Constants } from "@budibase/frontend-core"
import { selectedComponent } from "builderStore"
import { getComponentForSettingType } from "./componentSettings"
import PropertyControl from "./PropertyControl.svelte"
@@ -83,7 +83,7 @@
valueType: "string",
id: generate(),
action: "hide",
- operator: OperatorOptions.Equals.value,
+ operator: Constants.OperatorOptions.Equals.value,
},
]
}
@@ -108,13 +108,13 @@
}
const getOperatorOptions = condition => {
- return getValidOperatorsForType(condition.valueType)
+ return LuceneUtils.getValidOperatorsForType(condition.valueType)
}
const onOperatorChange = (condition, newOperator) => {
const noValueOptions = [
- OperatorOptions.Empty.value,
- OperatorOptions.NotEmpty.value,
+ Constants.OperatorOptions.Empty.value,
+ Constants.OperatorOptions.NotEmpty.value,
]
condition.noValue = noValueOptions.includes(newOperator)
if (condition.noValue) {
@@ -127,9 +127,12 @@
condition.referenceValue = null
// Ensure a valid operator is set
- const validOperators = getValidOperatorsForType(newType).map(x => x.value)
+ const validOperators = LuceneUtils.getValidOperatorsForType(newType).map(
+ x => x.value
+ )
if (!validOperators.includes(condition.operator)) {
- condition.operator = validOperators[0] ?? OperatorOptions.Equals.value
+ condition.operator =
+ validOperators[0] ?? Constants.OperatorOptions.Equals.value
onOperatorChange(condition, condition.operator)
}
}
diff --git a/packages/builder/src/components/design/PropertiesPanel/PropertyControls/FilterEditor/FilterDrawer.svelte b/packages/builder/src/components/design/PropertiesPanel/PropertyControls/FilterEditor/FilterDrawer.svelte
index ac97bf6065..ef56c610bd 100644
--- a/packages/builder/src/components/design/PropertiesPanel/PropertyControls/FilterEditor/FilterDrawer.svelte
+++ b/packages/builder/src/components/design/PropertiesPanel/PropertyControls/FilterEditor/FilterDrawer.svelte
@@ -13,7 +13,7 @@
import DrawerBindableInput from "components/common/bindings/DrawerBindableInput.svelte"
import ClientBindingPanel from "components/common/bindings/ClientBindingPanel.svelte"
import { generate } from "shortid"
- import { getValidOperatorsForType, OperatorOptions } from "constants/lucene"
+ import { LuceneUtils, Constants } from "@budibase/frontend-core"
import { getFields } from "helpers/searchFields"
export let schemaFields
@@ -32,7 +32,7 @@
{
id: generate(),
field: null,
- operator: OperatorOptions.Equals.value,
+ operator: Constants.OperatorOptions.Equals.value,
value: null,
valueType: "Value",
},
@@ -54,11 +54,12 @@
expression.type = enrichedSchemaFields.find(x => x.name === field)?.type
// Ensure a valid operator is set
- const validOperators = getValidOperatorsForType(expression.type).map(
- x => x.value
- )
+ const validOperators = LuceneUtils.getValidOperatorsForType(
+ expression.type
+ ).map(x => x.value)
if (!validOperators.includes(expression.operator)) {
- expression.operator = validOperators[0] ?? OperatorOptions.Equals.value
+ expression.operator =
+ validOperators[0] ?? Constants.OperatorOptions.Equals.value
onOperatorChange(expression, expression.operator)
}
@@ -73,8 +74,8 @@
const onOperatorChange = (expression, operator) => {
const noValueOptions = [
- OperatorOptions.Empty.value,
- OperatorOptions.NotEmpty.value,
+ Constants.OperatorOptions.Empty.value,
+ Constants.OperatorOptions.NotEmpty.value,
]
expression.noValue = noValueOptions.includes(operator)
if (expression.noValue) {
@@ -110,7 +111,7 @@
/>
onOperatorChange(filter, e.detail)}
placeholder={null}
diff --git a/packages/builder/src/components/design/PropertiesPanel/PropertyControls/ResetFieldsButton.svelte b/packages/builder/src/components/design/PropertiesPanel/PropertyControls/ResetFieldsButton.svelte
index fa2a0d6088..a76a93d7f6 100644
--- a/packages/builder/src/components/design/PropertiesPanel/PropertyControls/ResetFieldsButton.svelte
+++ b/packages/builder/src/components/design/PropertiesPanel/PropertyControls/ResetFieldsButton.svelte
@@ -1,5 +1,5 @@
diff --git a/packages/builder/src/components/design/PropertiesPanel/PropertyControls/SearchFieldSelect.svelte b/packages/builder/src/components/design/PropertiesPanel/PropertyControls/SearchFieldSelect.svelte
index 474fbc676c..e609426b1e 100644
--- a/packages/builder/src/components/design/PropertiesPanel/PropertyControls/SearchFieldSelect.svelte
+++ b/packages/builder/src/components/design/PropertiesPanel/PropertyControls/SearchFieldSelect.svelte
@@ -25,7 +25,7 @@
return base
}
const currentTable = $tables.list.find(table => table._id === ds.tableId)
- return getFields(base, { allowLinks: currentTable.sql }).map(
+ return getFields(base, { allowLinks: currentTable?.sql }).map(
field => field.name
)
}
diff --git a/packages/builder/src/components/design/PropertiesPanel/ScreenSettingsSection.svelte b/packages/builder/src/components/design/PropertiesPanel/ScreenSettingsSection.svelte
index 4a7c77746e..ded80a7d5c 100644
--- a/packages/builder/src/components/design/PropertiesPanel/ScreenSettingsSection.svelte
+++ b/packages/builder/src/components/design/PropertiesPanel/ScreenSettingsSection.svelte
@@ -1,7 +1,7 @@
@@ -34,7 +42,7 @@
control={prop.control}
key={prop.key}
value={style[prop.key]}
- onChange={val => store.actions.components.updateStyle(prop.key, val)}
+ onChange={val => updateStyle(prop.key, val)}
props={getControlProps(prop)}
{bindings}
/>
diff --git a/packages/builder/src/components/feedback/NPSFeedbackForm.svelte b/packages/builder/src/components/feedback/NPSFeedbackForm.svelte
index 4c5bb46c63..6a6e52ec74 100644
--- a/packages/builder/src/components/feedback/NPSFeedbackForm.svelte
+++ b/packages/builder/src/components/feedback/NPSFeedbackForm.svelte
@@ -13,6 +13,7 @@
Detail,
Divider,
Layout,
+ notifications,
} from "@budibase/bbui"
import { auth } from "stores/portal"
@@ -45,20 +46,28 @@
improvements,
comment,
})
- auth.updateSelf({
- flags: {
- feedbackSubmitted: true,
- },
- })
+ try {
+ auth.updateSelf({
+ flags: {
+ feedbackSubmitted: true,
+ },
+ })
+ } catch (error) {
+ notifications.error("Error updating user")
+ }
dispatch("complete")
}
function cancelFeedback() {
- auth.updateSelf({
- flags: {
- feedbackSubmitted: true,
- },
- })
+ try {
+ auth.updateSelf({
+ flags: {
+ feedbackSubmitted: true,
+ },
+ })
+ } catch (error) {
+ notifications.error("Error updating user")
+ }
dispatch("complete")
}
diff --git a/packages/builder/src/components/integration/AccessLevelSelect.svelte b/packages/builder/src/components/integration/AccessLevelSelect.svelte
index 97587c287a..59f6b8a105 100644
--- a/packages/builder/src/components/integration/AccessLevelSelect.svelte
+++ b/packages/builder/src/components/integration/AccessLevelSelect.svelte
@@ -1,5 +1,5 @@
diff --git a/packages/builder/src/components/start/ChooseIconModal.svelte b/packages/builder/src/components/start/ChooseIconModal.svelte
index 4efb679a51..b2f68c6ce7 100644
--- a/packages/builder/src/components/start/ChooseIconModal.svelte
+++ b/packages/builder/src/components/start/ChooseIconModal.svelte
@@ -1,5 +1,12 @@
diff --git a/packages/builder/src/components/start/CreateAppModal.svelte b/packages/builder/src/components/start/CreateAppModal.svelte
index 3efd0231aa..91c4807dc8 100644
--- a/packages/builder/src/components/start/CreateAppModal.svelte
+++ b/packages/builder/src/components/start/CreateAppModal.svelte
@@ -2,8 +2,8 @@
import { writable, get as svelteGet } from "svelte/store"
import { notifications, Input, ModalContent, Dropzone } from "@budibase/bbui"
import { store, automationStore } from "builderStore"
+ import { API } from "api"
import { apps, admin, auth } from "stores/portal"
- import api, { get, post } from "builderStore/api"
import analytics, { Events } from "analytics"
import { onMount } from "svelte"
import { goto } from "@roxi/routify"
@@ -45,43 +45,27 @@
}
// Create App
- const appResp = await post("/api/applications", data, {})
- const appJson = await appResp.json()
- if (!appResp.ok) {
- throw new Error(appJson.message)
- }
-
+ const createdApp = await API.createApp(data)
analytics.captureEvent(Events.APP.CREATED, {
name: $values.name,
- appId: appJson.instance._id,
+ appId: createdApp.instance._id,
templateToUse: template,
})
// Select Correct Application/DB in prep for creating user
- const applicationPkg = await get(
- `/api/applications/${appJson.instance._id}/appPackage`
- )
- const pkg = await applicationPkg.json()
- if (applicationPkg.ok) {
- await store.actions.initialise(pkg)
- await automationStore.actions.fetch()
- // update checklist - incase first app
- await admin.init()
- } else {
- throw new Error(pkg)
- }
+ const pkg = await API.fetchAppPackage(createdApp.instance._id)
+ await store.actions.initialise(pkg)
+ await automationStore.actions.fetch()
+ // Update checklist - in case first app
+ await admin.init()
// Create user
- const user = {
- roleId: $values.roleId,
- }
- const userResp = await api.post(`/api/users/metadata/self`, user)
- await userResp.json()
+ await API.updateOwnMetadata({ roleId: $values.roleId })
await auth.setInitInfo({})
- $goto(`/builder/app/${appJson.instance._id}`)
+ $goto(`/builder/app/${createdApp.instance._id}`)
} catch (error) {
console.error(error)
- notifications.error(error)
+ notifications.error("Error creating app")
}
}
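This refactor replaces hand-rolled `fetch` plus `res.ok` checks with a shared API client that throws on failure, letting callers collapse error handling into one try/catch and a friendly notification. A minimal sketch of that client pattern (the real client lives in `@budibase/frontend-core`; this is illustrative only):

```javascript
// fetchFn is injected so the client can be tested without a network
const makeAPIClient = fetchFn => ({
  post: async (url, body) => {
    const response = await fetchFn(url, {
      method: "POST",
      body: JSON.stringify(body),
    })
    const json = await response.json()
    if (!response.ok) {
      // Surface the server's message; callers catch and notify the user
      throw new Error(json.message || "API error")
    }
    return json
  },
})
```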
diff --git a/packages/builder/src/components/start/UpdateAppModal.svelte b/packages/builder/src/components/start/UpdateAppModal.svelte
index 7549876fc0..1ce699b834 100644
--- a/packages/builder/src/components/start/UpdateAppModal.svelte
+++ b/packages/builder/src/components/start/UpdateAppModal.svelte
@@ -38,7 +38,7 @@
await apps.update(app.instance._id, body)
} catch (error) {
console.error(error)
- notifications.error(error)
+ notifications.error("Error updating app")
}
}
diff --git a/packages/builder/src/constants/lucene.js b/packages/builder/src/constants/lucene.js
deleted file mode 100644
index 8a6bf57b5f..0000000000
--- a/packages/builder/src/constants/lucene.js
+++ /dev/null
@@ -1,96 +0,0 @@
-/**
- * Operator options for lucene queries
- */
-export const OperatorOptions = {
- Equals: {
- value: "equal",
- label: "Equals",
- },
- NotEquals: {
- value: "notEqual",
- label: "Not equals",
- },
- Empty: {
- value: "empty",
- label: "Is empty",
- },
- NotEmpty: {
- value: "notEmpty",
- label: "Is not empty",
- },
- StartsWith: {
- value: "string",
- label: "Starts with",
- },
- Like: {
- value: "fuzzy",
- label: "Like",
- },
- MoreThan: {
- value: "rangeLow",
- label: "More than",
- },
- LessThan: {
- value: "rangeHigh",
- label: "Less than",
- },
- Contains: {
- value: "equal",
- label: "Contains",
- },
- NotContains: {
- value: "notEqual",
- label: "Does Not Contain",
- },
-}
-
-export const NoEmptyFilterStrings = [
- OperatorOptions.StartsWith.value,
- OperatorOptions.Like.value,
- OperatorOptions.Equals.value,
- OperatorOptions.NotEquals.value,
- OperatorOptions.Contains.value,
- OperatorOptions.NotContains.value,
-]
-
-/**
- * Returns the valid operator options for a certain data type
- * @param type the data type
- */
-export const getValidOperatorsForType = type => {
- const Op = OperatorOptions
- const stringOps = [
- Op.Equals,
- Op.NotEquals,
- Op.StartsWith,
- Op.Like,
- Op.Empty,
- Op.NotEmpty,
- ]
- const numOps = [
- Op.Equals,
- Op.NotEquals,
- Op.MoreThan,
- Op.LessThan,
- Op.Empty,
- Op.NotEmpty,
- ]
- if (type === "string") {
- return stringOps
- } else if (type === "number") {
- return numOps
- } else if (type === "options") {
- return [Op.Equals, Op.NotEquals, Op.Empty, Op.NotEmpty]
- } else if (type === "array") {
- return [Op.Contains, Op.NotContains, Op.Empty, Op.NotEmpty]
- } else if (type === "boolean") {
- return [Op.Equals, Op.NotEquals, Op.Empty, Op.NotEmpty]
- } else if (type === "longform") {
- return stringOps
- } else if (type === "datetime") {
- return numOps
- } else if (type === "formula") {
- return stringOps.concat([Op.MoreThan, Op.LessThan])
- }
- return []
-}
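This module is deleted because it moves to `@budibase/frontend-core` as `LuceneUtils`/`Constants`, as the drawer diffs above show; the calling pattern is unchanged. The operator-fallback logic those callers share can be sketched standalone (trimmed to two field types; the `coerceOperator` name is illustrative):

```javascript
const OperatorOptions = {
  Equals: { value: "equal", label: "Equals" },
  NotEquals: { value: "notEqual", label: "Not equals" },
  MoreThan: { value: "rangeLow", label: "More than" },
  LessThan: { value: "rangeHigh", label: "Less than" },
}

const getValidOperatorsForType = type => {
  const Op = OperatorOptions
  if (type === "number") {
    return [Op.Equals, Op.NotEquals, Op.MoreThan, Op.LessThan]
  }
  return [Op.Equals, Op.NotEquals]
}

// When the field type changes, keep the operator if it is still valid,
// otherwise fall back to the first valid one (or Equals)
const coerceOperator = (operator, newType) => {
  const valid = getValidOperatorsForType(newType).map(x => x.value)
  return valid.includes(operator)
    ? operator
    : valid[0] ?? OperatorOptions.Equals.value
}
```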
diff --git a/packages/builder/src/helpers/fetchData.js b/packages/builder/src/helpers/fetchData.js
index 65061f6b6a..9208419c4e 100644
--- a/packages/builder/src/helpers/fetchData.js
+++ b/packages/builder/src/helpers/fetchData.js
@@ -1,5 +1,5 @@
import { writable } from "svelte/store"
-import api from "builderStore/api"
+import { API } from "api"
export default function (url) {
const store = writable({ status: "LOADING", data: {}, error: {} })
@@ -7,8 +7,8 @@ export default function (url) {
async function get() {
store.update(u => ({ ...u, status: "LOADING" }))
try {
- const response = await api.get(url)
- store.set({ data: await response.json(), status: "SUCCESS" })
+ const data = await API.get({ url })
+ store.set({ data, status: "SUCCESS" })
} catch (e) {
store.set({ data: {}, error: e, status: "ERROR" })
}
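The updated `fetchData` helper keeps the same LOADING → SUCCESS/ERROR status machine; here it is reduced to plain state (the real code wraps this in a `svelte/store` writable and the shared API client):

```javascript
// fetchFn stands in for API.get({ url })
const createFetchStore = fetchFn => {
  const state = { status: "LOADING", data: {}, error: {} }
  const load = async () => {
    state.status = "LOADING"
    try {
      state.data = await fetchFn()
      state.status = "SUCCESS"
    } catch (e) {
      state.error = e
      state.status = "ERROR"
    }
  }
  return { state, load }
}
```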
diff --git a/packages/builder/src/helpers/fetchTableData.js b/packages/builder/src/helpers/fetchTableData.js
deleted file mode 100644
index 6d61ec813e..0000000000
--- a/packages/builder/src/helpers/fetchTableData.js
+++ /dev/null
@@ -1,210 +0,0 @@
-// Do not use any aliased imports in common files, as these will be bundled
-// by multiple bundlers which may not be able to resolve them.
-// This will eventually be replaced by the new client implementation when we
-// add a core package.
-import { writable, derived, get } from "svelte/store"
-import * as API from "../builderStore/api"
-import { buildLuceneQuery } from "./lucene"
-
-const defaultOptions = {
- tableId: null,
- filters: null,
- limit: 10,
- sortColumn: null,
- sortOrder: "ascending",
- paginate: true,
- schema: null,
-}
-
-export const fetchTableData = opts => {
- // Save option set so we can override it later rather than relying on params
- let options = {
- ...defaultOptions,
- ...opts,
- }
-
- // Local non-observable state
- let query
- let sortType
- let lastBookmark
-
- // Local observable state
- const store = writable({
- rows: [],
- schema: null,
- loading: false,
- loaded: false,
- bookmarks: [],
- pageNumber: 0,
- })
-
- // Derive certain properties to return
- const derivedStore = derived(store, $store => {
- return {
- ...$store,
- hasNextPage: $store.bookmarks[$store.pageNumber + 1] != null,
- hasPrevPage: $store.pageNumber > 0,
- }
- })
-
- const fetchPage = async bookmark => {
- lastBookmark = bookmark
- const { tableId, limit, sortColumn, sortOrder, paginate } = options
- const res = await API.post(`/api/${options.tableId}/search`, {
- tableId,
- query,
- limit,
- sort: sortColumn,
- sortOrder: sortOrder?.toLowerCase() ?? "ascending",
- sortType,
- paginate,
- bookmark,
- })
- return await res.json()
- }
-
- // Fetches a fresh set of results from the server
- const fetchData = async () => {
- const { tableId, schema, sortColumn, filters } = options
-
- // Ensure table ID exists
- if (!tableId) {
- return
- }
-
- // Get and enrich schema.
- // Ensure there are "name" properties for all fields and that field schema
- // are objects
- let enrichedSchema = schema
- if (!enrichedSchema) {
- const definition = await API.get(`/api/tables/${tableId}`)
- enrichedSchema = definition?.schema ?? null
- }
- if (enrichedSchema) {
- Object.entries(schema).forEach(([fieldName, fieldSchema]) => {
- if (typeof fieldSchema === "string") {
- enrichedSchema[fieldName] = {
- type: fieldSchema,
- name: fieldName,
- }
- } else {
- enrichedSchema[fieldName] = {
- ...fieldSchema,
- name: fieldName,
- }
- }
- })
-
- // Save fixed schema so we can provide it later
- options.schema = enrichedSchema
- }
-
- // Ensure schema exists
- if (!schema) {
- return
- }
- store.update($store => ({ ...$store, schema, loading: true }))
-
- // Work out what sort type to use
- if (!sortColumn || !schema[sortColumn]) {
- sortType = "string"
- }
- const type = schema?.[sortColumn]?.type
- sortType = type === "number" ? "number" : "string"
-
- // Build the lucene query
- query = buildLuceneQuery(filters)
-
- // Actually fetch data
- const page = await fetchPage()
- store.update($store => ({
- ...$store,
- loading: false,
- loaded: true,
- pageNumber: 0,
- rows: page.rows,
- bookmarks: page.hasNextPage ? [null, page.bookmark] : [null],
- }))
- }
-
- // Fetches the next page of data
- const nextPage = async () => {
- const state = get(derivedStore)
- if (state.loading || !options.paginate || !state.hasNextPage) {
- return
- }
-
- // Fetch next page
- store.update($store => ({ ...$store, loading: true }))
- const page = await fetchPage(state.bookmarks[state.pageNumber + 1])
-
- // Update state
- store.update($store => {
- let { bookmarks, pageNumber } = $store
- if (page.hasNextPage) {
- bookmarks[pageNumber + 2] = page.bookmark
- }
- return {
- ...$store,
- pageNumber: pageNumber + 1,
- rows: page.rows,
- bookmarks,
- loading: false,
- }
- })
- }
-
- // Fetches the previous page of data
- const prevPage = async () => {
- const state = get(derivedStore)
- if (state.loading || !options.paginate || !state.hasPrevPage) {
- return
- }
-
- // Fetch previous page
- store.update($store => ({ ...$store, loading: true }))
- const page = await fetchPage(state.bookmarks[state.pageNumber - 1])
-
- // Update state
- store.update($store => {
- return {
- ...$store,
- pageNumber: $store.pageNumber - 1,
- rows: page.rows,
- loading: false,
- }
- })
- }
-
- // Resets the data set and updates options
- const update = async newOptions => {
- if (newOptions) {
- options = {
- ...options,
- ...newOptions,
- }
- }
- await fetchData()
- }
-
- // Loads the same page again
- const refresh = async () => {
- if (get(store).loading) {
- return
- }
- const page = await fetchPage(lastBookmark)
- store.update($store => ({ ...$store, rows: page.rows }))
- }
-
- // Initially fetch data but don't bother waiting for the result
- fetchData()
-
- // Return our derived store which will be updated over time
- return {
- subscribe: derivedStore.subscribe,
- nextPage,
- prevPage,
- update,
- refresh,
- }
-}
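The deleted `fetchTableData` helper's pagination hinges on a bookmark array: `bookmarks[n]` holds the bookmark needed to fetch page `n`, and `hasNextPage` is derived from whether the following page's bookmark is known. A synchronous sketch of that state, with illustrative names:

```javascript
const createPager = () => {
  // bookmarks[0] is null: the first page needs no bookmark
  const state = { bookmarks: [null], pageNumber: 0 }

  const hasNextPage = () => state.bookmarks[state.pageNumber + 1] != null
  const hasPrevPage = () => state.pageNumber > 0

  // Record the server's response for the current page
  const recordPage = page => {
    if (page.hasNextPage) {
      state.bookmarks[state.pageNumber + 1] = page.bookmark
    }
  }

  const nextPage = () => {
    if (hasNextPage()) state.pageNumber++
  }
  const prevPage = () => {
    if (hasPrevPage()) state.pageNumber--
  }

  return { state, hasNextPage, hasPrevPage, recordPage, nextPage, prevPage }
}
```

Because earlier bookmarks are retained, going back a page never requires re-walking from the start.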
diff --git a/packages/builder/src/pages/builder/_layout.svelte b/packages/builder/src/pages/builder/_layout.svelte
index 1d41af15e7..cb760cd165 100644
--- a/packages/builder/src/pages/builder/_layout.svelte
+++ b/packages/builder/src/pages/builder/_layout.svelte
@@ -2,12 +2,7 @@
import { isActive, redirect, params } from "@roxi/routify"
import { admin, auth } from "stores/portal"
import { onMount } from "svelte"
- import {
- Cookies,
- getCookie,
- removeCookie,
- setCookie,
- } from "builderStore/cookies"
+ import { CookieUtils, Constants } from "@budibase/frontend-core"
let loaded = false
@@ -46,9 +41,12 @@
if (user.tenantId !== urlTenantId) {
// user should not be here - play it safe and log them out
- await auth.logout()
- await auth.setOrganisation(null)
- return
+ try {
+ await auth.logout()
+ await auth.setOrganisation(null)
+ } catch (error) {
+ // Swallow error and do nothing
+ }
}
} else {
// no user - set the org according to the url
@@ -57,17 +55,23 @@
}
onMount(async () => {
- if ($params["?template"]) {
- await auth.setInitInfo({ init_template: $params["?template"] })
+ try {
+ await auth.getSelf()
+ await admin.init()
+
+ // Set init info if present
+ if ($params["?template"]) {
+ await auth.setInitInfo({ init_template: $params["?template"] })
+ }
+
+ // Validate tenant if in a multi-tenant env
+ if (useAccountPortal && multiTenancyEnabled) {
+ await validateTenantId()
+ }
+ } catch (error) {
+ // Don't show a notification here, as we might 403 initially due to not
+ // being logged in
}
-
- await auth.getSelf()
- await admin.init()
-
- if (useAccountPortal && multiTenancyEnabled) {
- await validateTenantId()
- }
-
loaded = true
})
@@ -79,7 +83,7 @@
loaded &&
apiReady &&
!$auth.user &&
- !getCookie(Cookies.ReturnUrl) &&
+ !CookieUtils.getCookie(Constants.Cookies.ReturnUrl) &&
// logout triggers a page refresh, so we don't want to set the return url
!$auth.postLogout &&
// don't set the return url on pre-login pages
@@ -88,7 +92,7 @@
!$isActive("./admin")
) {
const url = window.location.pathname
- setCookie(Cookies.ReturnUrl, url)
+ CookieUtils.setCookie(Constants.Cookies.ReturnUrl, url)
}
// if tenant is not set go to it
@@ -122,9 +126,9 @@
}
// lastly, redirect to the return url if it has been set
else if (loaded && apiReady && $auth.user) {
- const returnUrl = getCookie(Cookies.ReturnUrl)
+ const returnUrl = CookieUtils.getCookie(Constants.Cookies.ReturnUrl)
if (returnUrl) {
- removeCookie(Cookies.ReturnUrl)
+ CookieUtils.removeCookie(Constants.Cookies.ReturnUrl)
window.location.href = returnUrl
}
}
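The layout's return-URL flow remembers where an unauthenticated user was and redirects them there after login, clearing the value on use. A storage-agnostic sketch of that remember/consume pattern (the real code uses `CookieUtils` with `Constants.Cookies.ReturnUrl`; the key and helpers here are illustrative):

```javascript
const RETURN_URL_KEY = "returnUrl"

// Only remember the first pre-login location, like the !getCookie guard above
const rememberReturnUrl = (store, url) => {
  if (!store.get(RETURN_URL_KEY)) {
    store.set(RETURN_URL_KEY, url)
  }
}

// Read and clear in one step, so the redirect fires at most once
const consumeReturnUrl = store => {
  const url = store.get(RETURN_URL_KEY)
  if (url) {
    store.delete(RETURN_URL_KEY)
  }
  return url ?? null
}
```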
diff --git a/packages/builder/src/pages/builder/admin/_components/ImportAppsModal.svelte b/packages/builder/src/pages/builder/admin/_components/ImportAppsModal.svelte
index de29e11301..182df63967 100644
--- a/packages/builder/src/pages/builder/admin/_components/ImportAppsModal.svelte
+++ b/packages/builder/src/pages/builder/admin/_components/ImportAppsModal.svelte
@@ -1,6 +1,6 @@
@@ -36,10 +31,10 @@
onConfirm={importApps}
disabled={!value.file}
>
- Please upload the file that was exported from your Cloud environment to get
- started
+
+ Please upload the file that was exported from your Cloud environment to get
+ started
+
{
if (!cloud) {
- await admin.checkImportComplete()
+ try {
+ await admin.checkImportComplete()
+ } catch (error) {
+ notifications.error("Error checking import status")
+ }
}
})
diff --git a/packages/builder/src/pages/builder/app/[application]/_layout.svelte b/packages/builder/src/pages/builder/app/[application]/_layout.svelte
index 0478b46f73..1003936214 100644
--- a/packages/builder/src/pages/builder/app/[application]/_layout.svelte
+++ b/packages/builder/src/pages/builder/app/[application]/_layout.svelte
@@ -6,25 +6,26 @@
import RevertModal from "components/deploy/RevertModal.svelte"
import VersionModal from "components/deploy/VersionModal.svelte"
import NPSFeedbackForm from "components/feedback/NPSFeedbackForm.svelte"
- import { get, post } from "builderStore/api"
+ import { API } from "api"
import { auth, admin } from "stores/portal"
import { isActive, goto, layout, redirect } from "@roxi/routify"
import Logo from "assets/bb-emblem.svg"
import { capitalise } from "helpers"
- import UpgradeModal from "../../../../components/upgrade/UpgradeModal.svelte"
- import { onMount } from "svelte"
+ import UpgradeModal from "components/upgrade/UpgradeModal.svelte"
+ import { onMount, onDestroy } from "svelte"
+
+ export let application
// Get Package and set store
- export let application
let promise = getPackage()
- // sync once when you load the app
+
+ // Sync once when you load the app
let hasSynced = false
+ let userShouldPostFeedback = false
$: selected = capitalise(
$layout.children.find(layout => $isActive(layout.path))?.title ?? "data"
)
- let userShouldPostFeedback = false
-
function previewApp() {
if (!$auth?.user?.flags?.feedbackSubmitted) {
userShouldPostFeedback = true
@@ -33,34 +34,24 @@
}
async function getPackage() {
- const res = await get(`/api/applications/${application}/appPackage`)
- const pkg = await res.json()
-
- if (res.ok) {
- try {
- await store.actions.initialise(pkg)
- // edge case, lock wasn't known to client when it re-directed, or user went directly
- } catch (err) {
- if (!err.ok && err.reason === "locked") {
- $redirect("../../")
- } else {
- throw err
- }
- }
+ try {
+ const pkg = await API.fetchAppPackage(application)
+ await store.actions.initialise(pkg)
await automationStore.actions.fetch()
await roles.fetch()
await flags.fetch()
return pkg
- } else {
- throw new Error(pkg)
+ } catch (error) {
+ notifications.error(`Error initialising app: ${error?.message}`)
+ $redirect("../../")
}
}
- // handles navigation between frontend, backend, automation.
- // this remembers your last place on each of the sections
+ // Handles navigation between frontend, backend, automation.
+ // This remembers your last place on each of the sections
// e.g. if one of your screens is selected on front end, then
// you browse to backend, when you click frontend, you will be
- // brought back to the same screen
+ // brought back to the same screen.
const topItemNavigate = path => () => {
const activeTopNav = $layout.children.find(c => $isActive(c.path))
if (!activeTopNav) return
@@ -74,13 +65,18 @@
onMount(async () => {
if (!hasSynced && application) {
- const res = await post(`/api/applications/${application}/sync`)
- if (res.status !== 200) {
+ try {
+ await API.syncApp(application)
+ } catch (error) {
notifications.error("Failed to sync with production database")
}
hasSynced = true
}
})
+
+ onDestroy(() => {
+ store.actions.reset()
+ })
{#await promise}
diff --git a/packages/builder/src/pages/builder/app/[application]/data/datasource/[selectedDatasource]/rest/[query]/index.svelte b/packages/builder/src/pages/builder/app/[application]/data/datasource/[selectedDatasource]/rest/[query]/index.svelte
index 808c3a49ec..a0df3a9d07 100644
--- a/packages/builder/src/pages/builder/app/[application]/data/datasource/[selectedDatasource]/rest/[query]/index.svelte
+++ b/packages/builder/src/pages/builder/app/[application]/data/datasource/[selectedDatasource]/rest/[query]/index.svelte
@@ -59,6 +59,9 @@
$: schemaReadOnly = !responseSuccess
$: variablesReadOnly = !responseSuccess
$: showVariablesTab = shouldShowVariables(dynamicVariables, variablesReadOnly)
+ $: hasSchema =
+ Object.keys(schema || {}).length !== 0 ||
+ Object.keys(query?.schema || {}).length !== 0
function getSelectedQuery() {
return cloneDeep(
@@ -112,14 +115,13 @@
const { _id } = await queries.save(toSave.datasourceId, toSave)
saveId = _id
query = getSelectedQuery()
- notifications.success(`Request saved successfully.`)
-
+ notifications.success(`Request saved successfully`)
if (dynamicVariables) {
datasource.config.dynamicVariables = rebuildVariables(saveId)
datasource = await datasources.save(datasource)
}
} catch (err) {
- notifications.error(`Error saving query. ${err.message}`)
+ notifications.error(`Error saving query`)
}
}
@@ -127,14 +129,14 @@
try {
response = await queries.preview(buildQuery(query))
if (response.rows.length === 0) {
- notifications.info("Request did not return any data.")
+ notifications.info("Request did not return any data")
} else {
response.info = response.info || { code: 200 }
schema = response.schema
- notifications.success("Request sent successfully.")
+ notifications.success("Request sent successfully")
}
- } catch (err) {
- notifications.error(err)
+ } catch (error) {
+ notifications.error("Error running query")
}
}
@@ -226,10 +228,24 @@
)
}
+ const updateFlag = async (flag, value) => {
+ try {
+ await flags.updateFlag(flag, value)
+ } catch (error) {
+ notifications.error("Error updating flag")
+ }
+ }
+
onMount(async () => {
query = getSelectedQuery()
- // clear any unsaved changes to the datasource
- await datasources.init()
+
+ try {
+ // Clear any unsaved changes to the datasource
+ await datasources.init()
+ } catch (error) {
+ notifications.error("Error getting datasources")
+ }
+
datasource = $datasources.list.find(ds => ds._id === query?.datasourceId)
const datasourceUrl = datasource?.config.url
const qs = query?.fields.queryString
@@ -294,6 +310,7 @@
bind:value={query.name}
defaultValue="Untitled"
on:change={() => (query.flags.urlName = false)}
+ on:save={saveQuery}
/>
Access level
@@ -313,7 +330,15 @@
-
Send
+
Send
+
Save
@@ -393,8 +418,7 @@
window.open(
"https://docs.budibase.com/building-apps/data/transformers"
)}
- on:change={() =>
- flags.updateFlag("queryTransformerBanner", true)}
+ on:change={() => updateFlag("queryTransformerBanner", true)}
>
Add a JavaScript function to transform the query result.
@@ -527,9 +551,6 @@
>{response?.info.size}
- Save query
{/if}
diff --git a/packages/builder/src/pages/builder/apps/index.svelte b/packages/builder/src/pages/builder/apps/index.svelte
index c98e749e45..39cc780ac7 100644
--- a/packages/builder/src/pages/builder/apps/index.svelte
+++ b/packages/builder/src/pages/builder/apps/index.svelte
@@ -10,6 +10,7 @@
Icon,
Body,
Modal,
+ notifications,
} from "@budibase/bbui"
import { onMount } from "svelte"
import { apps, organisation, auth } from "stores/portal"
@@ -26,8 +27,12 @@
let changePasswordModal
onMount(async () => {
- await organisation.init()
- await apps.load()
+ try {
+ await organisation.init()
+ await apps.load()
+ } catch (error) {
+ notifications.error("Error loading apps")
+ }
loaded = true
})
@@ -47,6 +52,14 @@
return `/${app.prodId}`
}
}
+
+ const logout = async () => {
+ try {
+ await auth.logout()
+ } catch (error) {
+ // Swallow error and do nothing
+ }
+ }
{#if $auth.user && loaded}
@@ -82,7 +95,7 @@
Open developer mode
{/if}
- Log out
+ Log out
diff --git a/packages/builder/src/pages/builder/auth/_components/OIDCButton.svelte b/packages/builder/src/pages/builder/auth/_components/OIDCButton.svelte
index bae68b6548..27f5bde186 100644
--- a/packages/builder/src/pages/builder/auth/_components/OIDCButton.svelte
+++ b/packages/builder/src/pages/builder/auth/_components/OIDCButton.svelte
@@ -1,5 +1,5 @@
diff --git a/packages/builder/src/pages/builder/auth/index.svelte b/packages/builder/src/pages/builder/auth/index.svelte
index a2a02e65c1..72b3a8c7cf 100644
--- a/packages/builder/src/pages/builder/auth/index.svelte
+++ b/packages/builder/src/pages/builder/auth/index.svelte
@@ -2,6 +2,7 @@
import { redirect } from "@roxi/routify"
import { auth, admin } from "stores/portal"
import { onMount } from "svelte"
+ import { notifications } from "@budibase/bbui"
$: tenantSet = $auth.tenantSet
$: multiTenancyEnabled = $admin.multiTenancy
@@ -17,8 +18,12 @@
}
onMount(async () => {
- await admin.init()
- await auth.checkQueryString()
+ try {
+ await admin.init()
+ await auth.checkQueryString()
+ } catch (error) {
+ notifications.error("Error getting checklist")
+ }
loaded = true
})
diff --git a/packages/builder/src/pages/builder/auth/login.svelte b/packages/builder/src/pages/builder/auth/login.svelte
index 7a13164c51..d9151b4342 100644
--- a/packages/builder/src/pages/builder/auth/login.svelte
+++ b/packages/builder/src/pages/builder/auth/login.svelte
@@ -31,7 +31,6 @@
username,
password,
})
-
if ($auth?.user?.forceResetPassword) {
$goto("./reset")
} else {
@@ -39,8 +38,7 @@
$goto("../portal")
}
} catch (err) {
- console.error(err)
- notifications.error(err.message ? err.message : "Invalid Credentials")
+ notifications.error(err.message ? err.message : "Invalid credentials")
}
}
@@ -49,7 +47,11 @@
}
onMount(async () => {
- await organisation.init()
+ try {
+ await organisation.init()
+ } catch (error) {
+ notifications.error("Error getting org config")
+ }
loaded = true
})
diff --git a/packages/builder/src/pages/builder/auth/org.svelte b/packages/builder/src/pages/builder/auth/org.svelte
index 5a484b6c93..8fd94463d9 100644
--- a/packages/builder/src/pages/builder/auth/org.svelte
+++ b/packages/builder/src/pages/builder/auth/org.svelte
@@ -1,5 +1,13 @@
diff --git a/packages/builder/src/pages/builder/invite/index.svelte b/packages/builder/src/pages/builder/invite/index.svelte
index ddf888ad73..c4745d8737 100644
--- a/packages/builder/src/pages/builder/invite/index.svelte
+++ b/packages/builder/src/pages/builder/invite/index.svelte
@@ -10,14 +10,11 @@
async function acceptInvite() {
try {
- const res = await users.acceptInvite(inviteCode, password)
- if (!res) {
- throw new Error(res.message)
- }
- notifications.success(`User created.`)
+ await users.acceptInvite(inviteCode, password)
+ notifications.success("Invitation accepted successfully")
$goto("../auth/login")
- } catch (err) {
- notifications.error(err)
+ } catch (error) {
+ notifications.error("Error accepting invitation")
}
}
diff --git a/packages/builder/src/pages/builder/portal/_layout.svelte b/packages/builder/src/pages/builder/portal/_layout.svelte
index 8fca18d29d..f4679647ff 100644
--- a/packages/builder/src/pages/builder/portal/_layout.svelte
+++ b/packages/builder/src/pages/builder/portal/_layout.svelte
@@ -10,6 +10,7 @@
MenuItem,
Modal,
clickOutside,
+ notifications,
} from "@budibase/bbui"
import ConfigChecklist from "components/common/ConfigChecklist.svelte"
import { organisation, auth } from "stores/portal"
@@ -78,6 +79,14 @@
return menu
}
+ const logout = async () => {
+ try {
+ await auth.logout()
+ } catch (error) {
+ // Swallow error and do nothing
+ }
+ }
+
const showMobileMenu = () => (mobileMenuVisible = true)
const hideMobileMenu = () => (mobileMenuVisible = false)
@@ -87,7 +96,11 @@
if (!$auth.user?.builder?.global) {
$redirect("../")
} else {
- await organisation.init()
+ try {
+ await organisation.init()
+ } catch (error) {
+ notifications.error("Error getting org config")
+ }
loaded = true
}
}
@@ -158,7 +171,7 @@
$goto("../apps")}>
Close developer mode
- Log out
+ Log out
diff --git a/packages/builder/src/pages/builder/portal/apps/index.svelte b/packages/builder/src/pages/builder/portal/apps/index.svelte
index bf783fdb86..b05aa1b659 100644
--- a/packages/builder/src/pages/builder/portal/apps/index.svelte
+++ b/packages/builder/src/pages/builder/portal/apps/index.svelte
@@ -19,7 +19,7 @@
import ChooseIconModal from "components/start/ChooseIconModal.svelte"
import { store, automationStore } from "builderStore"
- import api, { del, post, get } from "builderStore/api"
+ import { API } from "api"
import { onMount } from "svelte"
import { apps, auth, admin, templates } from "stores/portal"
import download from "downloadjs"
@@ -115,43 +115,29 @@
data.append("templateKey", template.key)
// Create App
- const appResp = await post("/api/applications", data, {})
- const appJson = await appResp.json()
- if (!appResp.ok) {
- throw new Error(appJson.message)
- }
-
+ const createdApp = await API.createApp(data)
analytics.captureEvent(Events.APP.CREATED, {
name: appName,
- appId: appJson.instance._id,
+ appId: createdApp.instance._id,
template,
fromTemplateMarketplace: true,
})
// Select Correct Application/DB in prep for creating user
- const applicationPkg = await get(
- `/api/applications/${appJson.instance._id}/appPackage`
- )
- const pkg = await applicationPkg.json()
- if (applicationPkg.ok) {
- await store.actions.initialise(pkg)
- await automationStore.actions.fetch()
- // update checklist - incase first app
- await admin.init()
- } else {
- throw new Error(pkg)
- }
+ const pkg = await API.fetchAppPackage(createdApp.instance._id)
+ await store.actions.initialise(pkg)
+ await automationStore.actions.fetch()
+ // Update checklist - in case first app
+ await admin.init()
// Create user
- const userResp = await api.post(`/api/users/metadata/self`, {
+ await API.updateOwnMetadata({
roleId: "BASIC",
})
- await userResp.json()
await auth.setInitInfo({})
- $goto(`/builder/app/${appJson.instance._id}`)
+ $goto(`/builder/app/${createdApp.instance._id}`)
} catch (error) {
- console.error(error)
- notifications.error(error)
+ notifications.error("Error creating app")
}
}
@@ -199,17 +185,11 @@
return
}
try {
- const response = await del(
- `/api/applications/${selectedApp.prodId}?unpublish=1`
- )
- if (response.status !== 200) {
- const json = await response.json()
- throw json.message
- }
+ await API.unpublishApp(selectedApp.prodId)
await apps.load()
notifications.success("App unpublished successfully")
} catch (err) {
- notifications.error(`Error unpublishing app: ${err}`)
+ notifications.error("Error unpublishing app")
}
}
@@ -223,17 +203,13 @@
return
}
try {
- const response = await del(`/api/applications/${selectedApp?.devId}`)
- if (response.status !== 200) {
- const json = await response.json()
- throw json.message
- }
+ await API.deleteApp(selectedApp?.devId)
await apps.load()
- // get checklist, just in case that was the last app
+ // Get checklist, just in case that was the last app
await admin.init()
notifications.success("App deleted successfully")
} catch (err) {
- notifications.error(`Error deleting app: ${err}`)
+ notifications.error("Error deleting app")
}
selectedApp = null
appName = null
@@ -246,15 +222,11 @@
const releaseLock = async app => {
try {
- const response = await del(`/api/dev/${app.devId}/lock`)
- if (response.status !== 200) {
- const json = await response.json()
- throw json.message
- }
+ await API.releaseAppLock(app.devId)
await apps.load()
notifications.success("Lock released successfully")
} catch (err) {
- notifications.error(`Error releasing lock: ${err}`)
+ notifications.error("Error releasing lock")
}
}
@@ -272,17 +244,23 @@
}
onMount(async () => {
- await apps.load()
- await templates.load()
- if ($templates?.length === 0) {
- notifications.error("There was a problem loading quick start templates.")
- }
- // if the portal is loaded from an external URL with a template param
- const initInfo = await auth.getInitInfo()
- if (initInfo?.init_template) {
- creatingFromTemplate = true
- createAppFromTemplateUrl(initInfo.init_template)
- return
+ try {
+ await apps.load()
+ await templates.load()
+ if ($templates?.length === 0) {
+ notifications.error(
+          "There was a problem loading quick start templates"
+ )
+ }
+ // If the portal is loaded from an external URL with a template param
+ const initInfo = await auth.getInitInfo()
+ if (initInfo?.init_template) {
+ creatingFromTemplate = true
+ createAppFromTemplateUrl(initInfo.init_template)
+ return
+ }
+ } catch (error) {
+ notifications.error("Error loading apps and templates")
}
loaded = true
})
diff --git a/packages/builder/src/pages/builder/portal/manage/auth/index.svelte b/packages/builder/src/pages/builder/portal/manage/auth/index.svelte
index 20d30fdfbb..b001f02fe9 100644
--- a/packages/builder/src/pages/builder/portal/manage/auth/index.svelte
+++ b/packages/builder/src/pages/builder/portal/manage/auth/index.svelte
@@ -20,9 +20,9 @@
Toggle,
} from "@budibase/bbui"
import { onMount } from "svelte"
- import api from "builderStore/api"
+ import { API } from "api"
import { organisation, admin } from "stores/portal"
- import { uuid } from "builderStore/uuid"
+ import { Helpers } from "@budibase/bbui"
import analytics, { Events } from "analytics"
const ConfigTypes = {
@@ -137,17 +137,6 @@
providers.oidc?.config?.configs[0].clientID &&
providers.oidc?.config?.configs[0].clientSecret
- async function uploadLogo(file) {
- let data = new FormData()
- data.append("file", file)
- const res = await api.post(
- `/api/global/configs/upload/logos_oidc/${file.name}`,
- data,
- {}
- )
- return await res.json()
- }
-
const onFileSelected = e => {
let fileName = e.target.files[0].name
image = e.target.files[0]
@@ -156,17 +145,28 @@
}
async function save(docs) {
- // only if the user has provided an image, upload it.
- image && uploadLogo(image)
let calls = []
+
+ // Only if the user has provided an image, upload it
+ if (image) {
+ let data = new FormData()
+ data.append("file", image)
+ calls.push(
+ API.uploadOIDCLogo({
+ name: image.name,
+ data,
+ })
+ )
+ }
+
docs.forEach(element => {
if (element.type === ConfigTypes.OIDC) {
- //Add a UUID here so each config is distinguishable when it arrives at the login page
+ // Add a UUID here so each config is distinguishable when it arrives at the login page
for (let config of element.config.configs) {
if (!config.uuid) {
- config.uuid = uuid()
+ config.uuid = Helpers.uuid()
}
- // callback urls shouldn't be included
+          // Callback URLs shouldn't be included
delete config.callbackURL
}
if (partialOidc) {
@@ -175,8 +175,8 @@
`Please fill in all required ${ConfigTypes.OIDC} fields`
)
} else {
- calls.push(api.post(`/api/global/configs`, element))
- // turn the save button grey when clicked
+ calls.push(API.saveConfig(element))
+ // Turn the save button grey when clicked
oidcSaveButtonDisabled = true
originalOidcDoc = cloneDeep(providers.oidc)
}
@@ -189,71 +189,73 @@
`Please fill in all required ${ConfigTypes.Google} fields`
)
} else {
- calls.push(api.post(`/api/global/configs`, element))
+ calls.push(API.saveConfig(element))
googleSaveButtonDisabled = true
originalGoogleDoc = cloneDeep(providers.google)
}
}
}
})
- calls.length &&
+
+ if (calls.length) {
Promise.all(calls)
- .then(responses => {
- return Promise.all(
- responses.map(response => {
- return response.json()
- })
- )
- })
.then(data => {
data.forEach(res => {
providers[res.type]._rev = res._rev
providers[res.type]._id = res._id
})
- notifications.success(`Settings saved.`)
+ notifications.success(`Settings saved`)
analytics.captureEvent(Events.SSO.SAVED)
})
- .catch(err => {
- notifications.error(`Failed to update auth settings. ${err}`)
- throw new Error(err.message)
+ .catch(() => {
+ notifications.error("Failed to update auth settings")
})
+ }
}
onMount(async () => {
- await organisation.init()
- // fetch the configs for oauth
- const googleResponse = await api.get(
- `/api/global/configs/${ConfigTypes.Google}`
- )
- const googleDoc = await googleResponse.json()
+ try {
+ await organisation.init()
+ } catch (error) {
+ notifications.error("Error getting org config")
+ }
- if (!googleDoc._id) {
+ // Fetch Google config
+ let googleDoc
+ try {
+ googleDoc = await API.getConfig(ConfigTypes.Google)
+ } catch (error) {
+ notifications.error("Error fetching Google OAuth config")
+ }
+ if (!googleDoc?._id) {
providers.google = {
type: ConfigTypes.Google,
config: { activated: true },
}
originalGoogleDoc = cloneDeep(googleDoc)
} else {
- // default activated to true for older configs
+ // Default activated to true for older configs
if (googleDoc.config.activated === undefined) {
googleDoc.config.activated = true
}
originalGoogleDoc = cloneDeep(googleDoc)
providers.google = googleDoc
}
-
googleCallbackUrl = providers?.google?.config?.callbackURL
- //Get the list of user uploaded logos and push it to the dropdown options.
- //This needs to be done before the config call so they're available when the dropdown renders
- const res = await api.get(`/api/global/configs/logos_oidc`)
- const configSettings = await res.json()
-
- if (configSettings.config) {
- const logoKeys = Object.keys(configSettings.config)
-
+ // Get the list of user uploaded logos and push it to the dropdown options.
+ // This needs to be done before the config call so they're available when
+ // the dropdown renders.
+ let oidcLogos
+ try {
+ oidcLogos = await API.getOIDCLogos()
+ } catch (error) {
+ notifications.error("Error fetching OIDC logos")
+ }
+ if (oidcLogos?.config) {
+ const logoKeys = Object.keys(oidcLogos.config)
logoKeys.map(logoKey => {
- const logoUrl = configSettings.config[logoKey]
+ const logoUrl = oidcLogos.config[logoKey]
iconDropdownOptions.unshift({
label: logoKey,
value: logoKey,
@@ -261,11 +263,15 @@
})
})
}
- const oidcResponse = await api.get(
- `/api/global/configs/${ConfigTypes.OIDC}`
- )
- const oidcDoc = await oidcResponse.json()
- if (!oidcDoc._id) {
+
+ // Fetch OIDC config
+ let oidcDoc
+ try {
+ oidcDoc = await API.getConfig(ConfigTypes.OIDC)
+ } catch (error) {
+ notifications.error("Error fetching OIDC config")
+ }
+ if (!oidcDoc?._id) {
providers.oidc = {
type: ConfigTypes.OIDC,
config: { configs: [{ activated: true }] },
diff --git a/packages/builder/src/pages/builder/portal/manage/email/[template].svelte b/packages/builder/src/pages/builder/portal/manage/email/[template].svelte
index cc00f3d798..33ecca2a10 100644
--- a/packages/builder/src/pages/builder/portal/manage/email/[template].svelte
+++ b/packages/builder/src/pages/builder/portal/manage/email/[template].svelte
@@ -36,9 +36,9 @@
try {
// Save your template config
await email.templates.save(selectedTemplate)
- notifications.success(`Template saved.`)
- } catch (err) {
- notifications.error(`Failed to update template settings. ${err}`)
+ notifications.success("Template saved")
+ } catch (error) {
+ notifications.error("Failed to update template settings")
}
}
diff --git a/packages/builder/src/pages/builder/portal/manage/email/_layout.svelte b/packages/builder/src/pages/builder/portal/manage/email/_layout.svelte
index 410a7d4ff2..e371c2daae 100644
--- a/packages/builder/src/pages/builder/portal/manage/email/_layout.svelte
+++ b/packages/builder/src/pages/builder/portal/manage/email/_layout.svelte
@@ -1,7 +1,15 @@
diff --git a/packages/builder/src/pages/builder/portal/manage/email/index.svelte b/packages/builder/src/pages/builder/portal/manage/email/index.svelte
index 5a78623b81..4ef59d2daa 100644
--- a/packages/builder/src/pages/builder/portal/manage/email/index.svelte
+++ b/packages/builder/src/pages/builder/portal/manage/email/index.svelte
@@ -14,7 +14,7 @@
Checkbox,
} from "@budibase/bbui"
import { email } from "stores/portal"
- import api from "builderStore/api"
+ import { API } from "api"
import { cloneDeep } from "lodash/fp"
import analytics, { Events } from "analytics"
@@ -54,55 +54,48 @@
delete smtp.config.auth
}
// Save your SMTP config
- const response = await api.post(`/api/global/configs`, smtp)
-
- if (response.status !== 200) {
- const error = await response.text()
- let message
- try {
- message = JSON.parse(error).message
- } catch (err) {
- message = error
- }
- notifications.error(`Failed to save email settings, reason: ${message}`)
- } else {
- const json = await response.json()
- smtpConfig._rev = json._rev
- smtpConfig._id = json._id
- notifications.success(`Settings saved.`)
+ try {
+ const savedConfig = await API.saveConfig(smtp)
+ smtpConfig._rev = savedConfig._rev
+ smtpConfig._id = savedConfig._id
+ notifications.success(`Settings saved`)
analytics.captureEvent(Events.SMTP.SAVED)
+ } catch (error) {
+ notifications.error(
+ `Failed to save email settings, reason: ${error?.message || "Unknown"}`
+ )
}
}
async function fetchSmtp() {
loading = true
- // fetch the configs for smtp
- const smtpResponse = await api.get(
- `/api/global/configs/${ConfigTypes.SMTP}`
- )
- const smtpDoc = await smtpResponse.json()
-
- if (!smtpDoc._id) {
- smtpConfig = {
- type: ConfigTypes.SMTP,
- config: {
- secure: true,
- },
+ try {
+      // Fetch the configs for SMTP
+ const smtpDoc = await API.getConfig(ConfigTypes.SMTP)
+ if (!smtpDoc._id) {
+ smtpConfig = {
+ type: ConfigTypes.SMTP,
+ config: {
+ secure: true,
+ },
+ }
+ } else {
+ smtpConfig = smtpDoc
}
- } else {
- smtpConfig = smtpDoc
- }
- loading = false
- requireAuth = smtpConfig.config.auth != null
- // always attach the auth for the forms purpose -
- // this will be removed later if required
- if (!smtpDoc.config) {
- smtpDoc.config = {}
- }
- if (!smtpDoc.config.auth) {
- smtpConfig.config.auth = {
- type: "login",
+ loading = false
+ requireAuth = smtpConfig.config.auth != null
+      // Always attach the auth for the form's purpose -
+ // this will be removed later if required
+ if (!smtpDoc.config) {
+ smtpDoc.config = {}
}
+ if (!smtpDoc.config.auth) {
+ smtpConfig.config.auth = {
+ type: "login",
+ }
+ }
+ } catch (error) {
+ notifications.error("Error fetching SMTP config")
}
}
diff --git a/packages/builder/src/pages/builder/portal/manage/users/[userId].svelte b/packages/builder/src/pages/builder/portal/manage/users/[userId].svelte
index 549d0e4334..a8cb340465 100644
--- a/packages/builder/src/pages/builder/portal/manage/users/[userId].svelte
+++ b/packages/builder/src/pages/builder/portal/manage/users/[userId].svelte
@@ -64,31 +64,43 @@
const apps = fetchData(`/api/global/roles`)
async function deleteUser() {
- const res = await users.delete(userId)
- if (res.status === 200) {
+ try {
+ await users.delete(userId)
notifications.success(`User ${$userFetch?.data?.email} deleted.`)
$goto("./")
- } else {
- notifications.error(res?.message ? res.message : "Failed to delete user.")
+ } catch (error) {
+ notifications.error("Error deleting user")
}
}
let toggleDisabled = false
async function updateUserFirstName(evt) {
- await users.save({ ...$userFetch?.data, firstName: evt.target.value })
- await userFetch.refresh()
+ try {
+ await users.save({ ...$userFetch?.data, firstName: evt.target.value })
+ await userFetch.refresh()
+ } catch (error) {
+ notifications.error("Error updating user")
+ }
}
async function updateUserLastName(evt) {
- await users.save({ ...$userFetch?.data, lastName: evt.target.value })
- await userFetch.refresh()
+ try {
+ await users.save({ ...$userFetch?.data, lastName: evt.target.value })
+ await userFetch.refresh()
+ } catch (error) {
+ notifications.error("Error updating user")
+ }
}
async function toggleFlag(flagName, detail) {
toggleDisabled = true
- await users.save({ ...$userFetch?.data, [flagName]: { global: detail } })
- await userFetch.refresh()
+ try {
+ await users.save({ ...$userFetch?.data, [flagName]: { global: detail } })
+ await userFetch.refresh()
+ } catch (error) {
+ notifications.error("Error updating user")
+ }
toggleDisabled = false
}
diff --git a/packages/builder/src/pages/builder/portal/manage/users/_components/AddUserModal.svelte b/packages/builder/src/pages/builder/portal/manage/users/_components/AddUserModal.svelte
index 25a69af1c8..0255784a7b 100644
--- a/packages/builder/src/pages/builder/portal/manage/users/_components/AddUserModal.svelte
+++ b/packages/builder/src/pages/builder/portal/manage/users/_components/AddUserModal.svelte
@@ -21,12 +21,12 @@
const [email, error, touched] = createValidationStore("", emailValidator)
async function createUserFlow() {
- const res = await users.invite({ email: $email, builder, admin })
- if (res.status) {
- notifications.error(res.message)
- } else {
+ try {
+ const res = await users.invite({ email: $email, builder, admin })
notifications.success(res.message)
analytics.captureEvent(Events.USER.INVITE, { type: selected })
+ } catch (error) {
+ notifications.error("Error inviting user")
}
}
diff --git a/packages/builder/src/pages/builder/portal/manage/users/_components/BasicOnboardingModal.svelte b/packages/builder/src/pages/builder/portal/manage/users/_components/BasicOnboardingModal.svelte
index ff958d542b..29e2d56ed0 100644
--- a/packages/builder/src/pages/builder/portal/manage/users/_components/BasicOnboardingModal.svelte
+++ b/packages/builder/src/pages/builder/portal/manage/users/_components/BasicOnboardingModal.svelte
@@ -16,17 +16,17 @@
admin = false
async function createUser() {
- const res = await users.create({
- email: $email,
- password,
- builder,
- admin,
- forceResetPassword: true,
- })
- if (res.status) {
- notifications.error(res.message)
- } else {
+ try {
+ await users.create({
+ email: $email,
+ password,
+ builder,
+ admin,
+ forceResetPassword: true,
+ })
notifications.success("Successfully created user")
+ } catch (error) {
+ notifications.error("Error creating user")
}
}
diff --git a/packages/builder/src/pages/builder/portal/manage/users/_components/ForceResetPasswordModal.svelte b/packages/builder/src/pages/builder/portal/manage/users/_components/ForceResetPasswordModal.svelte
index 6468498df8..a380f0aa65 100644
--- a/packages/builder/src/pages/builder/portal/manage/users/_components/ForceResetPasswordModal.svelte
+++ b/packages/builder/src/pages/builder/portal/manage/users/_components/ForceResetPasswordModal.svelte
@@ -10,16 +10,16 @@
const password = Math.random().toString(36).substr(2, 20)
async function resetPassword() {
- const res = await users.save({
- ...user,
- password,
- forceResetPassword: true,
- })
- if (res.status) {
- notifications.error(res.message)
- } else {
- notifications.success("Password reset.")
+ try {
+ await users.save({
+ ...user,
+ password,
+ forceResetPassword: true,
+ })
+ notifications.success("Password reset successfully")
dispatch("update")
+ } catch (error) {
+ notifications.error("Error resetting password")
}
}
diff --git a/packages/builder/src/pages/builder/portal/manage/users/_components/UpdateRolesModal.svelte b/packages/builder/src/pages/builder/portal/manage/users/_components/UpdateRolesModal.svelte
index afa4c84f0e..5a60bfdff8 100644
--- a/packages/builder/src/pages/builder/portal/manage/users/_components/UpdateRolesModal.svelte
+++ b/packages/builder/src/pages/builder/portal/manage/users/_components/UpdateRolesModal.svelte
@@ -18,33 +18,31 @@
let selectedRole = user?.roles?.[app?._id]
async function updateUserRoles() {
- let res
- if (selectedRole === NO_ACCESS) {
- // remove the user role
- const filteredRoles = { ...user.roles }
- delete filteredRoles[app?._id]
- res = await users.save({
- ...user,
- roles: {
- ...filteredRoles,
- },
- })
- } else {
- // add the user role
- res = await users.save({
- ...user,
- roles: {
- ...user.roles,
- [app._id]: selectedRole,
- },
- })
- }
-
- if (res.status === 400) {
- notifications.error("Failed to update role")
- } else {
+ try {
+ if (selectedRole === NO_ACCESS) {
+ // Remove the user role
+ const filteredRoles = { ...user.roles }
+ delete filteredRoles[app?._id]
+ await users.save({
+ ...user,
+ roles: {
+ ...filteredRoles,
+ },
+ })
+ } else {
+ // Add the user role
+ await users.save({
+ ...user,
+ roles: {
+ ...user.roles,
+ [app._id]: selectedRole,
+ },
+ })
+ }
notifications.success("Role updated")
dispatch("update")
+ } catch (error) {
+ notifications.error("Failed to update role")
}
}
diff --git a/packages/builder/src/pages/builder/portal/manage/users/index.svelte b/packages/builder/src/pages/builder/portal/manage/users/index.svelte
index 124115a486..61192063cc 100644
--- a/packages/builder/src/pages/builder/portal/manage/users/index.svelte
+++ b/packages/builder/src/pages/builder/portal/manage/users/index.svelte
@@ -11,13 +11,13 @@
Label,
Layout,
Modal,
+ notifications,
} from "@budibase/bbui"
import TagsRenderer from "./_components/TagsTableRenderer.svelte"
import AddUserModal from "./_components/AddUserModal.svelte"
import BasicOnboardingModal from "./_components/BasicOnboardingModal.svelte"
import { users } from "stores/portal"
-
- users.init()
+ import { onMount } from "svelte"
const schema = {
email: {},
@@ -47,6 +47,14 @@
createUserModal.hide()
basicOnboardingModal.show()
}
+
+ onMount(async () => {
+ try {
+ await users.init()
+ } catch (error) {
+ notifications.error("Error getting user list")
+ }
+ })
diff --git a/packages/builder/src/pages/builder/portal/settings/organisation.svelte b/packages/builder/src/pages/builder/portal/settings/organisation.svelte
index 6903854922..7094a0af01 100644
--- a/packages/builder/src/pages/builder/portal/settings/organisation.svelte
+++ b/packages/builder/src/pages/builder/portal/settings/organisation.svelte
@@ -11,7 +11,7 @@
notifications,
} from "@budibase/bbui"
import { auth, organisation, admin } from "stores/portal"
- import { post } from "builderStore/api"
+ import { API } from "api"
import { writable } from "svelte/store"
import { redirect } from "@roxi/routify"
@@ -32,42 +32,40 @@
let loading = false
async function uploadLogo(file) {
- let data = new FormData()
- data.append("file", file)
- const res = await post(
- "/api/global/configs/upload/settings/logoUrl",
- data,
- {}
- )
- return await res.json()
+ try {
+ let data = new FormData()
+ data.append("file", file)
+ await API.uploadLogo(data)
+ } catch (error) {
+ notifications.error("Error uploading logo")
+ }
}
async function saveConfig() {
loading = true
- // Upload logo if required
- if ($values.logo && !$values.logo.url) {
- await uploadLogo($values.logo)
- await organisation.init()
- }
+ try {
+ // Upload logo if required
+ if ($values.logo && !$values.logo.url) {
+ await uploadLogo($values.logo)
+ await organisation.init()
+ }
- const config = {
- company: $values.company ?? "",
- platformUrl: $values.platformUrl ?? "",
- }
- // remove logo if required
- if (!$values.logo) {
- config.logoUrl = ""
- }
+ const config = {
+ company: $values.company ?? "",
+ platformUrl: $values.platformUrl ?? "",
+ }
- // Update settings
- const res = await organisation.save(config)
- if (res.status === 200) {
- notifications.success("Settings saved successfully")
- } else {
- notifications.error(res.message)
- }
+ // Remove logo if required
+ if (!$values.logo) {
+ config.logoUrl = ""
+ }
+ // Update settings
+ await organisation.save(config)
+ } catch (error) {
+ notifications.error("Error saving org config")
+ }
loading = false
}
diff --git a/packages/builder/src/pages/builder/portal/settings/update.svelte b/packages/builder/src/pages/builder/portal/settings/update.svelte
index 5deb724a7c..d87736144d 100644
--- a/packages/builder/src/pages/builder/portal/settings/update.svelte
+++ b/packages/builder/src/pages/builder/portal/settings/update.svelte
@@ -9,7 +9,7 @@
notifications,
Label,
} from "@budibase/bbui"
- import api from "builderStore/api"
+ import { API } from "api"
import { auth, admin } from "stores/portal"
import { redirect } from "@roxi/routify"
@@ -38,8 +38,12 @@
}
async function getVersion() {
- const response = await api.get("/api/dev/version")
- version = await response.text()
+ try {
+ version = await API.getBudibaseVersion()
+ } catch (error) {
+ notifications.error("Error getting Budibase version")
+ version = null
+ }
}
onMount(() => {
diff --git a/packages/builder/src/pages/index.svelte b/packages/builder/src/pages/index.svelte
index 477097f726..c6eaba8ff1 100644
--- a/packages/builder/src/pages/index.svelte
+++ b/packages/builder/src/pages/index.svelte
@@ -2,10 +2,14 @@
import { redirect } from "@roxi/routify"
import { auth } from "../stores/portal"
import { onMount } from "svelte"
+ import { notifications } from "@budibase/bbui"
- auth.checkQueryString()
-
- onMount(() => {
+ onMount(async () => {
+ try {
+ await auth.checkQueryString()
+ } catch (error) {
+ notifications.error("Error setting org")
+ }
$redirect(`./builder`)
})
diff --git a/packages/builder/src/stores/backend/datasources.js b/packages/builder/src/stores/backend/datasources.js
index 7810c3a950..2423394c6a 100644
--- a/packages/builder/src/stores/backend/datasources.js
+++ b/packages/builder/src/stores/backend/datasources.js
@@ -1,6 +1,6 @@
import { writable, get } from "svelte/store"
import { queries, tables, views } from "./"
-import api from "../../builderStore/api"
+import { API } from "api"
export const INITIAL_DATASOURCE_VALUES = {
list: [],
@@ -13,23 +13,20 @@ export function createDatasourcesStore() {
const { subscribe, update, set } = store
async function updateDatasource(response) {
- if (response.status !== 200) {
- throw new Error(await response.text())
- }
-
- const { datasource, error } = await response.json()
+ const { datasource, error } = response
update(state => {
const currentIdx = state.list.findIndex(ds => ds._id === datasource._id)
-
const sources = state.list
-
if (currentIdx >= 0) {
sources.splice(currentIdx, 1, datasource)
} else {
sources.push(datasource)
}
-
- return { list: sources, selected: datasource._id, schemaError: error }
+ return {
+ list: sources,
+ selected: datasource._id,
+ schemaError: error,
+ }
})
return datasource
}
@@ -38,25 +35,25 @@ export function createDatasourcesStore() {
subscribe,
update,
init: async () => {
- const response = await api.get(`/api/datasources`)
- const json = await response.json()
- set({ list: json, selected: null })
+ const datasources = await API.getDatasources()
+ set({
+ list: datasources,
+ selected: null,
+ })
},
fetch: async () => {
- const response = await api.get(`/api/datasources`)
- const json = await response.json()
+ const datasources = await API.getDatasources()
// Clear selected if it no longer exists, otherwise keep it
const selected = get(store).selected
let nextSelected = null
- if (selected && json.find(source => source._id === selected)) {
+ if (selected && datasources.find(source => source._id === selected)) {
nextSelected = selected
}
- update(state => ({ ...state, list: json, selected: nextSelected }))
- return json
+ update(state => ({ ...state, list: datasources, selected: nextSelected }))
},
- select: async datasourceId => {
+ select: datasourceId => {
update(state => ({ ...state, selected: datasourceId }))
queries.unselect()
tables.unselect()
@@ -66,37 +63,33 @@ export function createDatasourcesStore() {
update(state => ({ ...state, selected: null }))
},
updateSchema: async datasource => {
- let url = `/api/datasources/${datasource._id}/schema`
-
- const response = await api.post(url)
- return updateDatasource(response)
+ const response = await API.buildDatasourceSchema(datasource?._id)
+ return await updateDatasource(response)
},
save: async (body, fetchSchema = false) => {
let response
if (body._id) {
- response = await api.put(`/api/datasources/${body._id}`, body)
+ response = await API.updateDatasource(body)
} else {
- response = await api.post("/api/datasources", {
+ response = await API.createDatasource({
datasource: body,
fetchSchema,
})
}
-
return updateDatasource(response)
},
delete: async datasource => {
- const response = await api.delete(
- `/api/datasources/${datasource._id}/${datasource._rev}`
- )
+ await API.deleteDatasource({
+ datasourceId: datasource?._id,
+ datasourceRev: datasource?._rev,
+ })
update(state => {
const sources = state.list.filter(
existing => existing._id !== datasource._id
)
return { list: sources, selected: null }
})
-
await queries.fetch()
- return response
},
removeSchemaError: () => {
update(state => {
diff --git a/packages/builder/src/stores/backend/flags.js b/packages/builder/src/stores/backend/flags.js
index 7e5adcd00f..449d010640 100644
--- a/packages/builder/src/stores/backend/flags.js
+++ b/packages/builder/src/stores/backend/flags.js
@@ -1,37 +1,27 @@
import { writable } from "svelte/store"
-import api from "builderStore/api"
+import { API } from "api"
export function createFlagsStore() {
const { subscribe, set } = writable({})
- return {
- subscribe,
+ const actions = {
fetch: async () => {
- const { doc, response } = await getFlags()
- set(doc)
- return response
+ const flags = await API.getFlags()
+ set(flags)
},
updateFlag: async (flag, value) => {
- const response = await api.post("/api/users/flags", {
+ await API.updateFlag({
flag,
value,
})
- if (response.status === 200) {
- const { doc } = await getFlags()
- set(doc)
- }
- return response
+ await actions.fetch()
},
}
-}
-async function getFlags() {
- const response = await api.get("/api/users/flags")
- let doc = {}
- if (response.status === 200) {
- doc = await response.json()
+ return {
+ subscribe,
+ ...actions,
}
- return { doc, response }
}
export const flags = createFlagsStore()
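The refactored flags store groups its methods into an `actions` object so `updateFlag` can reuse `actions.fetch` after a write. A minimal runnable sketch of that pattern, using a stub API and a tiny `writable` stand-in so the example does not depend on Svelte (both stand-ins are illustrative):

```javascript
// Tiny stand-in for svelte/store's writable – just enough for this sketch
function writable(initial) {
  let value = initial
  const subscribers = new Set()
  return {
    subscribe(fn) {
      subscribers.add(fn)
      fn(value)
      return () => subscribers.delete(fn)
    },
    set(next) {
      value = next
      subscribers.forEach(fn => fn(value))
    },
  }
}

// Stub API – the real store calls API.getFlags / API.updateFlag
const API = {
  flags: { beta: false },
  getFlags: async () => ({ ...API.flags }),
  updateFlag: async ({ flag, value }) => {
    API.flags[flag] = value
  },
}

function createFlagsStore() {
  const { subscribe, set } = writable({})
  const actions = {
    fetch: async () => {
      set(await API.getFlags())
    },
    updateFlag: async (flag, value) => {
      await API.updateFlag({ flag, value })
      // Re-fetch so the store always mirrors the server state
      await actions.fetch()
    },
  }
  return { subscribe, ...actions }
}
```

Defining `actions` before the returned object is what lets one action call another without a self-reference to the store.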
diff --git a/packages/builder/src/stores/backend/integrations.js b/packages/builder/src/stores/backend/integrations.js
index d1df818248..717b656c72 100644
--- a/packages/builder/src/stores/backend/integrations.js
+++ b/packages/builder/src/stores/backend/integrations.js
@@ -1,3 +1,16 @@
import { writable } from "svelte/store"
+import { API } from "api"
-export const integrations = writable({})
+const createIntegrationsStore = () => {
+ const store = writable(null)
+
+ return {
+ ...store,
+ init: async () => {
+ const integrations = await API.getIntegrations()
+ store.set(integrations)
+ },
+ }
+}
+
+export const integrations = createIntegrationsStore()
diff --git a/packages/builder/src/stores/backend/permissions.js b/packages/builder/src/stores/backend/permissions.js
index 29159494ed..aaab406bc9 100644
--- a/packages/builder/src/stores/backend/permissions.js
+++ b/packages/builder/src/stores/backend/permissions.js
@@ -1,5 +1,5 @@
import { writable } from "svelte/store"
-import api from "builderStore/api"
+import { API } from "api"
export function createPermissionStore() {
const { subscribe } = writable([])
@@ -7,14 +7,14 @@ export function createPermissionStore() {
return {
subscribe,
save: async ({ level, role, resource }) => {
- const response = await api.post(
- `/api/permission/${role}/${resource}/${level}`
- )
- return await response.json()
+ return await API.updatePermissionForResource({
+ resourceId: resource,
+ roleId: role,
+ level,
+ })
},
forResource: async resourceId => {
- const response = await api.get(`/api/permission/${resourceId}`)
- return await response.json()
+ return await API.getPermissionForResource(resourceId)
},
}
}
diff --git a/packages/builder/src/stores/backend/queries.js b/packages/builder/src/stores/backend/queries.js
index 2018933ffc..6e30cb21f8 100644
--- a/packages/builder/src/stores/backend/queries.js
+++ b/packages/builder/src/stores/backend/queries.js
@@ -1,6 +1,6 @@
import { writable, get } from "svelte/store"
import { datasources, integrations, tables, views } from "./"
-import api from "builderStore/api"
+import { API } from "api"
import { duplicateName } from "../../helpers/duplicate"
const sortQueries = queryList => {
@@ -15,23 +15,26 @@ export function createQueriesStore() {
const actions = {
init: async () => {
- const response = await api.get(`/api/queries`)
- const json = await response.json()
- set({ list: json, selected: null })
+ const queries = await API.getQueries()
+ set({
+ list: queries,
+ selected: null,
+ })
},
fetch: async () => {
- const response = await api.get(`/api/queries`)
- const json = await response.json()
- sortQueries(json)
- update(state => ({ ...state, list: json }))
- return json
+ const queries = await API.getQueries()
+ sortQueries(queries)
+ update(state => ({
+ ...state,
+ list: queries,
+ }))
},
save: async (datasourceId, query) => {
const _integrations = get(integrations)
const dataSource = get(datasources).list.filter(
ds => ds._id === datasourceId
)
- // check if readable attribute is found
+ // Check if readable attribute is found
if (dataSource.length !== 0) {
const integration = _integrations[dataSource[0].source]
const readable = integration.query[query.queryVerb].readable
@@ -40,34 +43,28 @@ export function createQueriesStore() {
}
}
query.datasourceId = datasourceId
- const response = await api.post(`/api/queries`, query)
- if (response.status !== 200) {
- throw new Error("Failed saving query.")
- }
- const json = await response.json()
+ const savedQuery = await API.saveQuery(query)
update(state => {
- const currentIdx = state.list.findIndex(query => query._id === json._id)
-
+ const idx = state.list.findIndex(query => query._id === savedQuery._id)
const queries = state.list
-
- if (currentIdx >= 0) {
- queries.splice(currentIdx, 1, json)
+ if (idx >= 0) {
+ queries.splice(idx, 1, savedQuery)
} else {
- queries.push(json)
+ queries.push(savedQuery)
}
sortQueries(queries)
- return { list: queries, selected: json._id }
+ return {
+ list: queries,
+ selected: savedQuery._id,
+ }
})
- return json
+ return savedQuery
},
- import: async body => {
- const response = await api.post(`/api/queries/import`, body)
-
- if (response.status !== 200) {
- throw new Error(response.message)
- }
-
- return response.json()
+ import: async (data, datasourceId) => {
+ return await API.importQueries({
+ datasourceId,
+ data,
+ })
},
select: query => {
update(state => ({ ...state, selected: query._id }))
@@ -79,48 +76,37 @@ export function createQueriesStore() {
update(state => ({ ...state, selected: null }))
},
preview: async query => {
- const response = await api.post("/api/queries/preview", {
- fields: query.fields,
- queryVerb: query.queryVerb,
- transformer: query.transformer,
- parameters: query.parameters.reduce(
- (acc, next) => ({
- ...acc,
- [next.name]: next.default,
- }),
- {}
- ),
- datasourceId: query.datasourceId,
- queryId: query._id || undefined,
+ const parameters = query.parameters.reduce(
+ (acc, next) => ({
+ ...acc,
+ [next.name]: next.default,
+ }),
+ {}
+ )
+ const result = await API.previewQuery({
+ ...query,
+ parameters,
})
-
- if (response.status !== 200) {
- const error = await response.text()
- throw `Query error: ${error}`
- }
-
- const json = await response.json()
// Assume all the fields are strings and create a basic schema from the
// unique fields returned by the server
const schema = {}
- for (let field of json.schemaFields) {
+ for (let field of result.schemaFields) {
schema[field] = "string"
}
- return { ...json, schema, rows: json.rows || [] }
+ return { ...result, schema, rows: result.rows || [] }
},
delete: async query => {
- const response = await api.delete(
- `/api/queries/${query._id}/${query._rev}`
- )
+ await API.deleteQuery({
+ queryId: query?._id,
+ queryRev: query?._rev,
+ })
update(state => {
state.list = state.list.filter(existing => existing._id !== query._id)
if (state.selected === query._id) {
state.selected = null
}
-
return state
})
- return response
},
duplicate: async query => {
let list = get(store).list
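The rewritten `preview` above folds the query's parameter list into a `{ name: default }` object before calling `API.previewQuery`. That reduce can be sketched on its own (the `[{ name, default }]` input shape is taken from the surrounding code):

```javascript
// Convert [{ name, default }] into { [name]: default } for the preview call
function buildPreviewParameters(parameters) {
  return parameters.reduce(
    (acc, next) => ({
      ...acc,
      [next.name]: next.default,
    }),
    {}
  )
}
```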
diff --git a/packages/builder/src/stores/backend/roles.js b/packages/builder/src/stores/backend/roles.js
index 1a1a9c04c5..0c3cdbce5a 100644
--- a/packages/builder/src/stores/backend/roles.js
+++ b/packages/builder/src/stores/backend/roles.js
@@ -1,30 +1,32 @@
import { writable } from "svelte/store"
-import api from "builderStore/api"
+import { API } from "api"
export function createRolesStore() {
const { subscribe, update, set } = writable([])
- return {
- subscribe,
+ const actions = {
fetch: async () => {
- set(await getRoles())
+ const roles = await API.getRoles()
+ set(roles)
},
delete: async role => {
- const response = await api.delete(`/api/roles/${role._id}/${role._rev}`)
+ await API.deleteRole({
+ roleId: role?._id,
+ roleRev: role?._rev,
+ })
update(state => state.filter(existing => existing._id !== role._id))
- return response
},
save: async role => {
- const response = await api.post("/api/roles", role)
- set(await getRoles())
- return response
+ const savedRole = await API.saveRole(role)
+ await actions.fetch()
+ return savedRole
},
}
+
+ return {
+ subscribe,
+ ...actions,
+ }
}
-async function getRoles() {
- const response = await api.get("/api/roles")
- return await response.json()
-}
-
export const roles = createRolesStore()
diff --git a/packages/builder/src/stores/backend/tables.js b/packages/builder/src/stores/backend/tables.js
index 02db48c549..f6d20037cb 100644
--- a/packages/builder/src/stores/backend/tables.js
+++ b/packages/builder/src/stores/backend/tables.js
@@ -1,7 +1,7 @@
import { get, writable } from "svelte/store"
import { datasources, queries, views } from "./"
import { cloneDeep } from "lodash/fp"
-import api from "builderStore/api"
+import { API } from "api"
import { SWITCHABLE_TYPES } from "../../constants/backend"
export function createTablesStore() {
@@ -9,10 +9,11 @@ export function createTablesStore() {
const { subscribe, update, set } = store
async function fetch() {
- const tablesResponse = await api.get(`/api/tables`)
- const tables = await tablesResponse.json()
- update(state => ({ ...state, list: tables }))
- return tables
+ const tables = await API.getTables()
+ update(state => ({
+ ...state,
+ list: tables,
+ }))
}
async function select(table) {
@@ -38,16 +39,16 @@ export function createTablesStore() {
const oldTable = get(store).list.filter(t => t._id === table._id)[0]
const fieldNames = []
- // update any renamed schema keys to reflect their names
+ // Update any renamed schema keys to reflect their names
for (let key of Object.keys(updatedTable.schema)) {
- // if field name has been seen before remove it
+ // If the field name has been seen before, remove it
if (fieldNames.indexOf(key.toLowerCase()) !== -1) {
delete updatedTable.schema[key]
continue
}
const field = updatedTable.schema[key]
const oldField = oldTable?.schema[key]
- // if the type has changed then revert back to the old field
+ // If the type has changed, revert to the old field
if (
oldField != null &&
oldField?.type !== field.type &&
@@ -55,21 +56,17 @@ export function createTablesStore() {
) {
updatedTable.schema[key] = oldField
}
- // field has been renamed
+ // Field has been renamed
if (field.name && field.name !== key) {
updatedTable.schema[field.name] = field
updatedTable._rename = { old: key, updated: field.name }
delete updatedTable.schema[key]
}
- // finally record this field has been used
+ // Finally, record that this field has been used
fieldNames.push(key.toLowerCase())
}
- const response = await api.post(`/api/tables`, updatedTable)
- if (response.status !== 200) {
- throw (await response.json()).message
- }
- const savedTable = await response.json()
+ const savedTable = await API.saveTable(updatedTable)
await fetch()
if (table.type === "external") {
await datasources.fetch()
@@ -91,21 +88,18 @@ export function createTablesStore() {
},
save,
init: async () => {
- const response = await api.get("/api/tables")
- const json = await response.json()
+ const tables = await API.getTables()
set({
- list: json,
+ list: tables,
selected: {},
draft: {},
})
},
delete: async table => {
- const response = await api.delete(
- `/api/tables/${table._id}/${table._rev}`
- )
- if (response.status !== 200) {
- throw (await response.json()).message
- }
+ await API.deleteTable({
+ tableId: table?._id,
+ tableRev: table?._rev,
+ })
update(state => ({
...state,
list: state.list.filter(existing => existing._id !== table._id),
@@ -156,12 +150,16 @@ export function createTablesStore() {
await promise
}
},
- deleteField: field => {
+ deleteField: async field => {
+ let promise
update(state => {
delete state.draft.schema[field.name]
- save(state.draft)
+ promise = save(state.draft)
return state
})
+ if (promise) {
+ await promise
+ }
},
}
}
diff --git a/packages/builder/src/stores/backend/views.js b/packages/builder/src/stores/backend/views.js
index 14c7bf92a4..849a66f671 100644
--- a/packages/builder/src/stores/backend/views.js
+++ b/packages/builder/src/stores/backend/views.js
@@ -1,6 +1,6 @@
import { writable, get } from "svelte/store"
import { tables, datasources, queries } from "./"
-import api from "builderStore/api"
+import { API } from "api"
export function createViewsStore() {
const { subscribe, update } = writable({
@@ -11,7 +11,7 @@ export function createViewsStore() {
return {
subscribe,
update,
- select: async view => {
+ select: view => {
update(state => ({
...state,
selected: view,
@@ -27,16 +27,14 @@ export function createViewsStore() {
}))
},
delete: async view => {
- await api.delete(`/api/views/${view}`)
+ await API.deleteView(view)
await tables.fetch()
},
save: async view => {
- const response = await api.post(`/api/views`, view)
- const json = await response.json()
-
+ const savedView = await API.saveView(view)
const viewMeta = {
name: view.name,
- ...json,
+ ...savedView,
}
const viewTable = get(tables).list.find(
diff --git a/packages/builder/src/stores/portal/admin.js b/packages/builder/src/stores/portal/admin.js
index d98eae8363..dc68c43cc5 100644
--- a/packages/builder/src/stores/portal/admin.js
+++ b/packages/builder/src/stores/portal/admin.js
@@ -1,5 +1,5 @@
import { writable, get } from "svelte/store"
-import api from "builderStore/api"
+import { API } from "api"
import { auth } from "stores/portal"
export function createAdminStore() {
@@ -23,64 +23,37 @@ export function createAdminStore() {
const admin = writable(DEFAULT_CONFIG)
async function init() {
- try {
- const tenantId = get(auth).tenantId
- const response = await api.get(
- `/api/global/configs/checklist?tenantId=${tenantId}`
- )
- const json = await response.json()
- const totalSteps = Object.keys(json).length
- const completedSteps = Object.values(json).filter(x => x?.checked).length
-
- await getEnvironment()
- admin.update(store => {
- store.loaded = true
- store.checklist = json
- store.onboardingProgress = (completedSteps / totalSteps) * 100
- return store
- })
- } catch (err) {
- admin.update(store => {
- store.checklist = null
- return store
- })
- }
+ const tenantId = get(auth).tenantId
+ const checklist = await API.getChecklist(tenantId)
+ const totalSteps = Object.keys(checklist).length
+ const completedSteps = Object.values(checklist).filter(
+ x => x?.checked
+ ).length
+ await getEnvironment()
+ admin.update(store => {
+ store.loaded = true
+ store.checklist = checklist
+ store.onboardingProgress = (completedSteps / totalSteps) * 100
+ return store
+ })
}
async function checkImportComplete() {
- const response = await api.get(`/api/cloud/import/complete`)
- if (response.status === 200) {
- const json = await response.json()
- admin.update(store => {
- store.importComplete = json ? json.imported : false
- return store
- })
- }
+ const result = await API.checkImportComplete()
+ admin.update(store => {
+ store.importComplete = result ? result.imported : false
+ return store
+ })
}
async function getEnvironment() {
- let multiTenancyEnabled = false
- let cloud = false
- let disableAccountPortal = false
- let accountPortalUrl = ""
- let isDev = false
- try {
- const response = await api.get(`/api/system/environment`)
- const json = await response.json()
- multiTenancyEnabled = json.multiTenancy
- cloud = json.cloud
- disableAccountPortal = json.disableAccountPortal
- accountPortalUrl = json.accountPortalUrl
- isDev = json.isDev
- } catch (err) {
- // just let it stay disabled
- }
+ const environment = await API.getEnvironment()
admin.update(store => {
- store.multiTenancy = multiTenancyEnabled
- store.cloud = cloud
- store.disableAccountPortal = disableAccountPortal
- store.accountPortalUrl = accountPortalUrl
- store.isDev = isDev
+ store.multiTenancy = environment.multiTenancy
+ store.cloud = environment.cloud
+ store.disableAccountPortal = environment.disableAccountPortal
+ store.accountPortalUrl = environment.accountPortalUrl
+ store.isDev = environment.isDev
return store
})
}
diff --git a/packages/builder/src/stores/portal/apps.js b/packages/builder/src/stores/portal/apps.js
index de944c057d..b8fb8c5670 100644
--- a/packages/builder/src/stores/portal/apps.js
+++ b/packages/builder/src/stores/portal/apps.js
@@ -1,7 +1,6 @@
import { writable } from "svelte/store"
-import { get } from "builderStore/api"
import { AppStatus } from "../../constants"
-import api from "../../builderStore/api"
+import { API } from "api"
const extractAppId = id => {
const split = id?.split("_") || []
@@ -12,77 +11,67 @@ export function createAppStore() {
const store = writable([])
async function load() {
- try {
- const res = await get(`/api/applications?status=all`)
- const json = await res.json()
- if (res.ok && Array.isArray(json)) {
- // Merge apps into one sensible list
- let appMap = {}
- let devApps = json.filter(app => app.status === AppStatus.DEV)
- let deployedApps = json.filter(app => app.status === AppStatus.DEPLOYED)
+ const json = await API.getApps()
+ if (Array.isArray(json)) {
+ // Merge apps into one sensible list
+ let appMap = {}
+ let devApps = json.filter(app => app.status === AppStatus.DEV)
+ let deployedApps = json.filter(app => app.status === AppStatus.DEPLOYED)
- // First append all dev app version
- devApps.forEach(app => {
- const id = extractAppId(app.appId)
- appMap[id] = {
- ...app,
- devId: app.appId,
- devRev: app._rev,
- }
- })
+ // First append all dev app version
+ devApps.forEach(app => {
+ const id = extractAppId(app.appId)
+ appMap[id] = {
+ ...app,
+ devId: app.appId,
+ devRev: app._rev,
+ }
+ })
- // Then merge with all prod app versions
- deployedApps.forEach(app => {
- const id = extractAppId(app.appId)
+ // Then merge with all prod app versions
+ deployedApps.forEach(app => {
+ const id = extractAppId(app.appId)
- // Skip any deployed apps which don't have a dev counterpart
- if (!appMap[id]) {
- return
- }
+ // Skip any deployed apps which don't have a dev counterpart
+ if (!appMap[id]) {
+ return
+ }
- appMap[id] = {
- ...appMap[id],
- ...app,
- prodId: app.appId,
- prodRev: app._rev,
- }
- })
+ appMap[id] = {
+ ...appMap[id],
+ ...app,
+ prodId: app.appId,
+ prodRev: app._rev,
+ }
+ })
- // Transform into an array and clean up
- const apps = Object.values(appMap)
- apps.forEach(app => {
- app.appId = extractAppId(app.devId)
- delete app._id
- delete app._rev
- })
- store.set(apps)
- } else {
- store.set([])
- }
- return json
- } catch (error) {
+ // Transform into an array and clean up
+ const apps = Object.values(appMap)
+ apps.forEach(app => {
+ app.appId = extractAppId(app.devId)
+ delete app._id
+ delete app._rev
+ })
+ store.set(apps)
+ } else {
store.set([])
}
}
async function update(appId, value) {
- console.log({ value })
- const response = await api.put(`/api/applications/${appId}`, { ...value })
- if (response.status === 200) {
- store.update(state => {
- const updatedAppIndex = state.findIndex(
- app => app.instance._id === appId
- )
- if (updatedAppIndex !== -1) {
- let updatedApp = state[updatedAppIndex]
- updatedApp = { ...updatedApp, ...value }
- state.apps = state.splice(updatedAppIndex, 1, updatedApp)
- }
- return state
- })
- } else {
- throw new Error("Error updating name")
- }
+ await API.saveAppMetadata({
+ appId,
+ metadata: value,
+ })
+ store.update(state => {
+ const updatedAppIndex = state.findIndex(app => app.instance._id === appId)
+ if (updatedAppIndex !== -1) {
+ let updatedApp = state[updatedAppIndex]
+ updatedApp = { ...updatedApp, ...value }
+ state.apps = state.splice(updatedAppIndex, 1, updatedApp)
+ }
+ return state
+ })
}
return {
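After `API.saveAppMetadata` succeeds, the store above patches the matching app in its list. Note that `state.apps = state.splice(updatedAppIndex, 1, updatedApp)` relies on `splice` mutating the array in place – the assignment itself receives the removed elements, not the updated list. An immutable version of the same merge-by-index update can be sketched as follows (this is an alternative formulation, not Budibase's exact code):

```javascript
// Return a new list with the app whose instance._id matches appId
// merged with the updated metadata; unmatched lists are returned as-is.
function patchApp(apps, appId, metadata) {
  const idx = apps.findIndex(app => app.instance._id === appId)
  if (idx === -1) {
    return apps
  }
  const next = apps.slice()
  next[idx] = { ...apps[idx], ...metadata }
  return next
}
```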
diff --git a/packages/builder/src/stores/portal/auth.js b/packages/builder/src/stores/portal/auth.js
index c4197a89c0..d66e901163 100644
--- a/packages/builder/src/stores/portal/auth.js
+++ b/packages/builder/src/stores/portal/auth.js
@@ -1,5 +1,5 @@
import { derived, writable, get } from "svelte/store"
-import api from "../../builderStore/api"
+import { API } from "api"
import { admin } from "stores/portal"
import analytics from "analytics"
@@ -54,18 +54,25 @@ export function createAuthStore() {
})
if (user) {
- analytics.activate().then(() => {
- analytics.identify(user._id, user)
- analytics.showChat({
- email: user.email,
- created_at: (user.createdAt || Date.now()) / 1000,
- name: user.account?.name,
- user_id: user._id,
- tenant: user.tenantId,
- "Company size": user.account?.size,
- "Job role": user.account?.profession,
+ analytics
+ .activate()
+ .then(() => {
+ analytics.identify(user._id, user)
+ analytics.showChat({
+ email: user.email,
+ created_at: (user.createdAt || Date.now()) / 1000,
+ name: user.account?.name,
+ user_id: user._id,
+ tenant: user.tenantId,
+ "Company size": user.account?.size,
+ "Job role": user.account?.profession,
+ })
+ })
+ .catch(() => {
+ // This request may fail due to browser extensions blocking requests
+ // containing the word analytics, so we don't want to spam users with
+ // an error here.
})
- })
}
}
@@ -83,7 +90,7 @@ export function createAuthStore() {
}
async function setInitInfo(info) {
- await api.post(`/api/global/auth/init`, info)
+ await API.setInitInfo(info)
auth.update(store => {
store.initInfo = info
return store
@@ -91,7 +98,7 @@ export function createAuthStore() {
return info
}
- async function setPostLogout() {
+ function setPostLogout() {
auth.update(store => {
store.postLogout = true
return store
@@ -99,13 +106,12 @@ export function createAuthStore() {
}
async function getInitInfo() {
- const response = await api.get(`/api/global/auth/init`)
- const json = response.json()
+ const info = await API.getInitInfo()
auth.update(store => {
- store.initInfo = json
+ store.initInfo = info
return store
})
- return json
+ return info
}
const actions = {
@@ -120,76 +126,51 @@ export function createAuthStore() {
await setOrganisation(tenantId)
},
getSelf: async () => {
- const response = await api.get("/api/global/users/self")
- if (response.status !== 200) {
+ // Catch this locally: this call should never fail, since we're simply
+ // either logged in or we aren't. Normally we never swallow API errors at
+ // the store level, but here we also always need to update the loaded flag.
+ try {
+ const user = await API.fetchBuilderSelf()
+ setUser(user)
+ } catch (error) {
setUser(null)
- } else {
- const json = await response.json()
- setUser(json)
}
},
login: async creds => {
const tenantId = get(store).tenantId
- const response = await api.post(
- `/api/global/auth/${tenantId}/login`,
- creds
- )
- if (response.status === 200) {
- await actions.getSelf()
- } else {
- const json = await response.json()
- throw new Error(json.message ? json.message : "Invalid credentials")
- }
+ await API.logIn({
+ username: creds.username,
+ password: creds.password,
+ tenantId,
+ })
+ await actions.getSelf()
},
logout: async () => {
- const response = await api.post(`/api/global/auth/logout`)
- if (response.status !== 200) {
- throw "Unable to create logout"
- }
- await response.json()
- await setInitInfo({})
setUser(null)
setPostLogout()
+ await API.logOut()
+ await setInitInfo({})
},
updateSelf: async fields => {
const newUser = { ...get(auth).user, ...fields }
- const response = await api.post("/api/global/users/self", newUser)
- if (response.status === 200) {
- setUser(newUser)
- } else {
- throw "Unable to update user details"
- }
+ await API.updateSelf(newUser)
+ setUser(newUser)
},
forgotPassword: async email => {
const tenantId = get(store).tenantId
- const response = await api.post(`/api/global/auth/${tenantId}/reset`, {
+ await API.requestForgotPassword({
+ tenantId,
email,
})
- if (response.status !== 200) {
- throw "Unable to send email with reset link"
- }
- await response.json()
},
- resetPassword: async (password, code) => {
+ resetPassword: async (password, resetCode) => {
const tenantId = get(store).tenantId
- const response = await api.post(
- `/api/global/auth/${tenantId}/reset/update`,
- {
- password,
- resetCode: code,
- }
- )
- if (response.status !== 200) {
- throw "Unable to reset password"
- }
- await response.json()
- },
- createUser: async user => {
- const response = await api.post(`/api/global/users`, user)
- if (response.status !== 200) {
- throw "Unable to create user"
- }
- await response.json()
+ await API.resetPassword({
+ tenantId,
+ password,
+ resetCode,
+ })
},
}
diff --git a/packages/builder/src/stores/portal/email.js b/packages/builder/src/stores/portal/email.js
index a015480141..2e222d34c4 100644
--- a/packages/builder/src/stores/portal/email.js
+++ b/packages/builder/src/stores/portal/email.js
@@ -1,5 +1,5 @@
import { writable } from "svelte/store"
-import api from "builderStore/api"
+import { API } from "api"
export function createEmailStore() {
const store = writable({})
@@ -8,14 +8,9 @@ export function createEmailStore() {
subscribe: store.subscribe,
templates: {
fetch: async () => {
- // fetch the email template definitions
- const response = await api.get(`/api/global/template/definitions`)
- const definitions = await response.json()
-
- // fetch the email templates themselves
- const templatesResponse = await api.get(`/api/global/template/email`)
- const templates = await templatesResponse.json()
-
+ // Fetch the email template definitions and templates
+ const definitions = await API.getEmailTemplateDefinitions()
+ const templates = await API.getEmailTemplates()
store.set({
definitions,
templates,
@@ -23,15 +18,12 @@ export function createEmailStore() {
},
save: async template => {
// Save your template config
- const response = await api.post(`/api/global/template`, template)
- const json = await response.json()
- if (response.status !== 200) throw new Error(json.message)
- template._rev = json._rev
- template._id = json._id
-
+ const savedTemplate = await API.saveEmailTemplate(template)
+ template._rev = savedTemplate._rev
+ template._id = savedTemplate._id
store.update(state => {
const currentIdx = state.templates.findIndex(
- template => template.purpose === json.purpose
+ template => template.purpose === savedTemplate.purpose
)
state.templates.splice(currentIdx, 1, template)
return state
diff --git a/packages/builder/src/stores/portal/oidc.js b/packages/builder/src/stores/portal/oidc.js
index 3e3a7048ca..3a4b954753 100644
--- a/packages/builder/src/stores/portal/oidc.js
+++ b/packages/builder/src/stores/portal/oidc.js
@@ -1,5 +1,5 @@
import { writable, get } from "svelte/store"
-import api from "builderStore/api"
+import { API } from "api"
import { auth } from "stores/portal"
const OIDC_CONFIG = {
@@ -11,26 +11,20 @@ const OIDC_CONFIG = {
export function createOidcStore() {
const store = writable(OIDC_CONFIG)
const { set, subscribe } = store
-
- async function init() {
- const tenantId = get(auth).tenantId
- const res = await api.get(
- `/api/global/configs/public/oidc?tenantId=${tenantId}`
- )
- const json = await res.json()
-
- if (json.status === 400 || Object.keys(json).length === 0) {
- set(OIDC_CONFIG)
- } else {
- // Just use the first config for now. We will be support multiple logins buttons later on.
- set(...json)
- }
- }
-
return {
subscribe,
set,
- init,
+ init: async () => {
+ const tenantId = get(auth).tenantId
+ const config = await API.getOIDCConfig(tenantId)
+ if (Object.keys(config || {}).length) {
+        // Just use the first config for now.
+        // We will support multiple login buttons later on.
+ set(...config)
+ } else {
+ set(OIDC_CONFIG)
+ }
+ },
}
}
diff --git a/packages/builder/src/stores/portal/organisation.js b/packages/builder/src/stores/portal/organisation.js
index 21a110c54a..9709578fa2 100644
--- a/packages/builder/src/stores/portal/organisation.js
+++ b/packages/builder/src/stores/portal/organisation.js
@@ -1,5 +1,5 @@
import { writable, get } from "svelte/store"
-import api from "builderStore/api"
+import { API } from "api"
import { auth } from "stores/portal"
const DEFAULT_CONFIG = {
@@ -19,35 +19,23 @@ export function createOrganisationStore() {
async function init() {
const tenantId = get(auth).tenantId
- const res = await api.get(`/api/global/configs/public?tenantId=${tenantId}`)
- const json = await res.json()
-
- if (json.status === 400) {
- set(DEFAULT_CONFIG)
- } else {
- set({ ...DEFAULT_CONFIG, ...json.config, _rev: json._rev })
- }
+ const tenant = await API.getTenantConfig(tenantId)
+ set({ ...DEFAULT_CONFIG, ...tenant.config, _rev: tenant._rev })
}
async function save(config) {
- // delete non-persisted fields
+ // Delete non-persisted fields
const storeConfig = get(store)
delete storeConfig.oidc
delete storeConfig.google
delete storeConfig.oidcCallbackUrl
delete storeConfig.googleCallbackUrl
-
- const res = await api.post("/api/global/configs", {
+ await API.saveConfig({
type: "settings",
config: { ...get(store), ...config },
_rev: get(store)._rev,
})
- const json = await res.json()
- if (json.status) {
- return json
- }
await init()
- return { status: 200 }
}
return {
diff --git a/packages/builder/src/stores/portal/templates.js b/packages/builder/src/stores/portal/templates.js
index b82ecd70e2..904e9cfa8e 100644
--- a/packages/builder/src/stores/portal/templates.js
+++ b/packages/builder/src/stores/portal/templates.js
@@ -1,18 +1,15 @@
import { writable } from "svelte/store"
-import api from "builderStore/api"
+import { API } from "api"
export function templatesStore() {
const { subscribe, set } = writable([])
- async function load() {
- const response = await api.get("/api/templates?type=app")
- const json = await response.json()
- set(json)
- }
-
return {
subscribe,
- load,
+ load: async () => {
+ const templates = await API.getAppTemplates()
+ set(templates)
+ },
}
}
diff --git a/packages/builder/src/stores/portal/users.js b/packages/builder/src/stores/portal/users.js
index 9a3df120e0..cebf03d4c0 100644
--- a/packages/builder/src/stores/portal/users.js
+++ b/packages/builder/src/stores/portal/users.js
@@ -1,38 +1,28 @@
import { writable } from "svelte/store"
-import api, { post } from "builderStore/api"
+import { API } from "api"
import { update } from "lodash"
export function createUsersStore() {
const { subscribe, set } = writable([])
async function init() {
- const response = await api.get(`/api/global/users`)
- const json = await response.json()
- set(json)
+ const users = await API.getUsers()
+ set(users)
}
async function invite({ email, builder, admin }) {
- const body = { email, userInfo: {} }
- if (admin) {
- body.userInfo.admin = {
- global: true,
- }
- }
- if (builder) {
- body.userInfo.builder = {
- global: true,
- }
- }
- const response = await api.post(`/api/global/users/invite`, body)
- return await response.json()
+ await API.inviteUser({
+ email,
+ builder,
+ admin,
+ })
}
async function acceptInvite(inviteCode, password) {
- const response = await api.post("/api/global/users/invite/accept", {
+ await API.acceptInvite({
inviteCode,
password,
})
- return await response.json()
}
async function create({
@@ -56,29 +46,17 @@ export function createUsersStore() {
if (admin) {
body.admin = { global: true }
}
- const response = await api.post("/api/global/users", body)
+ await API.saveUser(body)
await init()
- return await response.json()
}
async function del(id) {
- const response = await api.delete(`/api/global/users/${id}`)
+ await API.deleteUser(id)
update(users => users.filter(user => user._id !== id))
- const json = await response.json()
- return {
- ...json,
- status: response.status,
- }
}
async function save(data) {
- try {
- const res = await post(`/api/global/users`, data)
- return await res.json()
- } catch (error) {
- console.log(error)
- return error
- }
+ await API.saveUser(data)
}
return {
diff --git a/packages/builder/tsconfig.json b/packages/builder/tsconfig.json
new file mode 100644
index 0000000000..6a5ba315a1
--- /dev/null
+++ b/packages/builder/tsconfig.json
@@ -0,0 +1,23 @@
+{
+ "compilerOptions": {
+ "target": "es6",
+ "module": "commonjs",
+ "lib": ["es2019"],
+ "allowJs": true,
+ "outDir": "dist",
+ "strict": true,
+ "noImplicitAny": true,
+ "esModuleInterop": true,
+ "resolveJsonModule": true,
+ "incremental": true
+ },
+ "include": [
+ "./src/**/*"
+ ],
+ "exclude": [
+ "node_modules",
+ "**/*.json",
+ "**/*.spec.ts",
+ "**/*.spec.js"
+ ]
+}
diff --git a/packages/builder/vite.config.js b/packages/builder/vite.config.js
index d66d677555..b68d265bc5 100644
--- a/packages/builder/vite.config.js
+++ b/packages/builder/vite.config.js
@@ -56,6 +56,10 @@ export default ({ mode }) => {
find: "stores",
replacement: path.resolve("./src/stores"),
},
+ {
+ find: "api",
+ replacement: path.resolve("./src/api.js"),
+ },
{
find: "constants",
replacement: path.resolve("./src/constants"),
diff --git a/packages/builder/yarn.lock b/packages/builder/yarn.lock
index f827c20328..f9e90c0c53 100644
--- a/packages/builder/yarn.lock
+++ b/packages/builder/yarn.lock
@@ -2,11 +2,6 @@
# yarn lockfile v1
-"@adobe/spectrum-css-workflow-icons@^1.2.1":
- version "1.2.1"
- resolved "https://registry.yarnpkg.com/@adobe/spectrum-css-workflow-icons/-/spectrum-css-workflow-icons-1.2.1.tgz#7e2cb3fcfb5c8b12d7275afafbb6ec44913551b4"
- integrity sha512-uVgekyBXnOVkxp+CUssjN/gefARtudZC8duEn1vm0lBQFwGRZFlDEzU1QC+aIRWCrD1Z8OgRpmBYlSZ7QS003w==
-
"@babel/code-frame@^7.0.0", "@babel/code-frame@^7.10.4", "@babel/code-frame@^7.16.0":
version "7.16.0"
resolved "https://registry.yarnpkg.com/@babel/code-frame/-/code-frame-7.16.0.tgz#0dfc80309beec8411e65e706461c408b0bb9b431"
@@ -920,180 +915,11 @@
resolved "https://registry.yarnpkg.com/@bcoe/v8-coverage/-/v8-coverage-0.2.3.tgz#75a2e8b51cb758a7553d6804a5932d7aace75c39"
integrity sha512-0hYQ8SB4Db5zvZB4axdMHGwEaQjkZzFjQiN9LVYvIFB2nSUHW9tYpxWriPrWDASIxiaXax83REcLxuSdnGPZtw==
-"@budibase/bbui@^0.9.139":
- version "0.9.190"
- resolved "https://registry.yarnpkg.com/@budibase/bbui/-/bbui-0.9.190.tgz#e1ec400ac90f556bfbc80fc23a04506f1585ea81"
- integrity sha512-eQg5JzN6BT4zmn1erO+iJlfYltCFODmxk11FAApKj4Pe0qZrkSDs9yZRDhwt6PPIZt+Vver8K8J/L29AmX7AIw==
- dependencies:
- "@adobe/spectrum-css-workflow-icons" "^1.2.1"
- "@spectrum-css/actionbutton" "^1.0.1"
- "@spectrum-css/actiongroup" "^1.0.1"
- "@spectrum-css/avatar" "^3.0.2"
- "@spectrum-css/button" "^3.0.1"
- "@spectrum-css/buttongroup" "^3.0.2"
- "@spectrum-css/checkbox" "^3.0.2"
- "@spectrum-css/dialog" "^3.0.1"
- "@spectrum-css/divider" "^1.0.3"
- "@spectrum-css/dropzone" "^3.0.2"
- "@spectrum-css/fieldgroup" "^3.0.2"
- "@spectrum-css/fieldlabel" "^3.0.1"
- "@spectrum-css/icon" "^3.0.1"
- "@spectrum-css/illustratedmessage" "^3.0.2"
- "@spectrum-css/inlinealert" "^2.0.1"
- "@spectrum-css/inputgroup" "^3.0.2"
- "@spectrum-css/label" "^2.0.10"
- "@spectrum-css/link" "^3.1.1"
- "@spectrum-css/menu" "^3.0.1"
- "@spectrum-css/modal" "^3.0.1"
- "@spectrum-css/pagination" "^3.0.3"
- "@spectrum-css/picker" "^1.0.1"
- "@spectrum-css/popover" "^3.0.1"
- "@spectrum-css/progressbar" "^1.0.2"
- "@spectrum-css/progresscircle" "^1.0.2"
- "@spectrum-css/radio" "^3.0.2"
- "@spectrum-css/search" "^3.0.2"
- "@spectrum-css/sidenav" "^3.0.2"
- "@spectrum-css/statuslight" "^3.0.2"
- "@spectrum-css/stepper" "^3.0.3"
- "@spectrum-css/switch" "^1.0.2"
- "@spectrum-css/table" "^3.0.1"
- "@spectrum-css/tabs" "^3.0.1"
- "@spectrum-css/tags" "^3.0.2"
- "@spectrum-css/textfield" "^3.0.1"
- "@spectrum-css/toast" "^3.0.1"
- "@spectrum-css/tooltip" "^3.0.3"
- "@spectrum-css/treeview" "^3.0.2"
- "@spectrum-css/typography" "^3.0.1"
- "@spectrum-css/underlay" "^2.0.9"
- "@spectrum-css/vars" "^3.0.1"
- dayjs "^1.10.4"
- svelte-flatpickr "^3.2.3"
- svelte-portal "^1.0.0"
-
-"@budibase/bbui@^1.0.46", "@budibase/bbui@^1.0.46-alpha.3":
- version "1.0.46"
- resolved "https://registry.yarnpkg.com/@budibase/bbui/-/bbui-1.0.46.tgz#7306d4eda7f2c827577a4affa1fd314b38ba1198"
- integrity sha512-padm0qq2SBNIslXEQW+HIv32pkIHFzloR93FDzSXh0sO43Q+/d2gbAhjI9ZUSAVncx9JNc46dolL1CwrvHFElg==
- dependencies:
- "@adobe/spectrum-css-workflow-icons" "^1.2.1"
- "@spectrum-css/actionbutton" "^1.0.1"
- "@spectrum-css/actiongroup" "^1.0.1"
- "@spectrum-css/avatar" "^3.0.2"
- "@spectrum-css/button" "^3.0.1"
- "@spectrum-css/buttongroup" "^3.0.2"
- "@spectrum-css/checkbox" "^3.0.2"
- "@spectrum-css/dialog" "^3.0.1"
- "@spectrum-css/divider" "^1.0.3"
- "@spectrum-css/dropzone" "^3.0.2"
- "@spectrum-css/fieldgroup" "^3.0.2"
- "@spectrum-css/fieldlabel" "^3.0.1"
- "@spectrum-css/icon" "^3.0.1"
- "@spectrum-css/illustratedmessage" "^3.0.2"
- "@spectrum-css/inlinealert" "^2.0.1"
- "@spectrum-css/inputgroup" "^3.0.2"
- "@spectrum-css/label" "^2.0.10"
- "@spectrum-css/link" "^3.1.1"
- "@spectrum-css/menu" "^3.0.1"
- "@spectrum-css/modal" "^3.0.1"
- "@spectrum-css/pagination" "^3.0.3"
- "@spectrum-css/picker" "^1.0.1"
- "@spectrum-css/popover" "^3.0.1"
- "@spectrum-css/progressbar" "^1.0.2"
- "@spectrum-css/progresscircle" "^1.0.2"
- "@spectrum-css/radio" "^3.0.2"
- "@spectrum-css/search" "^3.0.2"
- "@spectrum-css/sidenav" "^3.0.2"
- "@spectrum-css/statuslight" "^3.0.2"
- "@spectrum-css/stepper" "^3.0.3"
- "@spectrum-css/switch" "^1.0.2"
- "@spectrum-css/table" "^3.0.1"
- "@spectrum-css/tabs" "^3.0.1"
- "@spectrum-css/tags" "^3.0.2"
- "@spectrum-css/textfield" "^3.0.1"
- "@spectrum-css/toast" "^3.0.1"
- "@spectrum-css/tooltip" "^3.0.3"
- "@spectrum-css/treeview" "^3.0.2"
- "@spectrum-css/typography" "^3.0.1"
- "@spectrum-css/underlay" "^2.0.9"
- "@spectrum-css/vars" "^3.0.1"
- dayjs "^1.10.4"
- svelte-flatpickr "^3.2.3"
- svelte-portal "^1.0.0"
-
-"@budibase/client@^1.0.46-alpha.3":
- version "1.0.46"
- resolved "https://registry.yarnpkg.com/@budibase/client/-/client-1.0.46.tgz#e6ef8945b9d7046b6e6d6761628aa1d85387acca"
- integrity sha512-jI3z1G/EsfJNCQCvrqzsR4vR1zLoVefzCXCEASIPg9BPzdiAFSwuUJVLijLFIIKfuDVeveUll94fgu7XNY8U2w==
- dependencies:
- "@budibase/bbui" "^1.0.46"
- "@budibase/standard-components" "^0.9.139"
- "@budibase/string-templates" "^1.0.46"
- regexparam "^1.3.0"
- shortid "^2.2.15"
- svelte-spa-router "^3.0.5"
-
"@budibase/colorpicker@1.1.2":
version "1.1.2"
resolved "https://registry.yarnpkg.com/@budibase/colorpicker/-/colorpicker-1.1.2.tgz#f7436924ee746d7be9b2009c2fa193e710c30f89"
integrity sha512-2PlZBVkATDqDC4b4Ri8Xi8X3OxhuHOGfmZwtXbZL38lNIeofaQT3Qyc1ECzEY5N+HrdGrWhY9EnliF6QM+LIuA==
-"@budibase/handlebars-helpers@^0.11.7":
- version "0.11.7"
- resolved "https://registry.yarnpkg.com/@budibase/handlebars-helpers/-/handlebars-helpers-0.11.7.tgz#8e5f9843d7dd10503e9f608555a96ccf4d836c46"
- integrity sha512-PvGHAv22cWSFExs1kc0WglwsmCEUEOqWvSp6JCFZwtc3qAAr5yMfLK8WGVQ63ALvyzWZiyxF+yrlzeeaohCMJw==
- dependencies:
- array-sort "^1.0.0"
- define-property "^2.0.2"
- extend-shallow "^3.0.2"
- for-in "^1.0.2"
- get-object "^0.2.0"
- get-value "^3.0.1"
- handlebars "^4.7.7"
- handlebars-utils "^1.0.6"
- has-value "^2.0.2"
- helper-date "^1.0.1"
- helper-markdown "^1.0.0"
- helper-md "^0.2.2"
- html-tag "^2.0.0"
- is-even "^1.0.0"
- is-glob "^4.0.1"
- kind-of "^6.0.3"
- micromatch "^3.1.5"
- relative "^3.0.2"
- striptags "^3.1.1"
- to-gfm-code-block "^0.1.1"
- year "^0.2.1"
-
-"@budibase/standard-components@^0.9.139":
- version "0.9.139"
- resolved "https://registry.yarnpkg.com/@budibase/standard-components/-/standard-components-0.9.139.tgz#cf8e2b759ae863e469e50272b3ca87f2827e66e3"
- integrity sha512-Av0u9Eq2jerjhG6Atta+c0mOQGgE5K0QI3cm+8s/3Vki6/PXkO1YL5Alo3BOn9ayQAVZ/xp4rtZPuN/rzRibHw==
- dependencies:
- "@budibase/bbui" "^0.9.139"
- "@spectrum-css/button" "^3.0.3"
- "@spectrum-css/card" "^3.0.3"
- "@spectrum-css/divider" "^1.0.3"
- "@spectrum-css/link" "^3.1.3"
- "@spectrum-css/page" "^3.0.1"
- "@spectrum-css/typography" "^3.0.2"
- "@spectrum-css/vars" "^3.0.1"
- apexcharts "^3.22.1"
- dayjs "^1.10.5"
- svelte-apexcharts "^1.0.2"
- svelte-flatpickr "^3.1.0"
-
-"@budibase/string-templates@^1.0.46", "@budibase/string-templates@^1.0.46-alpha.3":
- version "1.0.46"
- resolved "https://registry.yarnpkg.com/@budibase/string-templates/-/string-templates-1.0.46.tgz#5beef1687b451e4512a465b4e143c8ab46234006"
- integrity sha512-t4ZAUkSz2XatjAN0faex5ovmD3mFz672lV/aBk7tfLFzZiKlWjngqdwpLLQNnsqeGvYo75JP2J06j86SX6O83w==
- dependencies:
- "@budibase/handlebars-helpers" "^0.11.7"
- dayjs "^1.10.4"
- handlebars "^4.7.6"
- handlebars-utils "^1.0.6"
- lodash "^4.17.20"
- vm2 "^3.9.4"
-
"@cnakazawa/watch@^1.0.3":
version "1.0.4"
resolved "https://registry.yarnpkg.com/@cnakazawa/watch/-/watch-1.0.4.tgz#f864ae85004d0fcab6f50be9141c4da368d1656a"
@@ -1102,6 +928,18 @@
exec-sh "^0.3.2"
minimist "^1.2.0"
+"@cspotcode/source-map-consumer@0.8.0":
+ version "0.8.0"
+ resolved "https://registry.yarnpkg.com/@cspotcode/source-map-consumer/-/source-map-consumer-0.8.0.tgz#33bf4b7b39c178821606f669bbc447a6a629786b"
+ integrity sha512-41qniHzTU8yAGbCp04ohlmSrZf8bkf/iJsl3V0dRGsQN/5GFfx+LbCSsCpp2gqrqjTVg/K6O8ycoV35JIwAzAg==
+
+"@cspotcode/source-map-support@0.7.0":
+ version "0.7.0"
+ resolved "https://registry.yarnpkg.com/@cspotcode/source-map-support/-/source-map-support-0.7.0.tgz#4789840aa859e46d2f3173727ab707c66bf344f5"
+ integrity sha512-X4xqRHqN8ACt2aHVe51OxeA2HjbcL4MqFqXkrmQszJ1NOUuUu5u6Vqx/0lZSVNku7velL5FC/s5uEAj1lsBMhA==
+ dependencies:
+ "@cspotcode/source-map-consumer" "0.8.0"
+
"@cypress/listr-verbose-renderer@^0.4.1":
version "0.4.1"
resolved "https://registry.yarnpkg.com/@cypress/listr-verbose-renderer/-/listr-verbose-renderer-0.4.1.tgz#a77492f4b11dcc7c446a34b3e28721afd33c642a"
@@ -1518,108 +1356,6 @@
dependencies:
"@sinonjs/commons" "^1.7.0"
-"@spectrum-css/actionbutton@^1.0.1":
- version "1.1.2"
- resolved "https://registry.yarnpkg.com/@spectrum-css/actionbutton/-/actionbutton-1.1.2.tgz#6fd58dd56b59b03a21ec4cff036d3ada94779079"
- integrity sha512-gM0Mo1A+rV9osUYoI272Do8qgFDqosmLxyD94eEJUxFRi7IU7+RpdFA+CQv9R5sonz6dDKcjcE1gTrl5XO+6Tg==
-
-"@spectrum-css/actiongroup@^1.0.1":
- version "1.0.13"
- resolved "https://registry.yarnpkg.com/@spectrum-css/actiongroup/-/actiongroup-1.0.13.tgz#f9c0cc6e5459946f17fbeee1448b4aece28d4ec4"
- integrity sha512-NNvsqxSSxOZct13dvbxhZc9B6T2fnRZNDeVsiXUnbs6O43YQCpRb1wLmGH4x93FLA/YFJAX8nUghKm7DiPCu8w==
-
-"@spectrum-css/avatar@^3.0.2":
- version "3.0.2"
- resolved "https://registry.yarnpkg.com/@spectrum-css/avatar/-/avatar-3.0.2.tgz#4f1826223eae330e64b6d3cc899e9bc2e98dac95"
- integrity sha512-wEczvSqxttTWSiL3cOvXV/RmGRwSkw2w6+slcHhnf0kb7ovymMM+9oz8vvEpEsSeo5u598bc+7ktrKFpAd6soQ==
-
-"@spectrum-css/button@^3.0.1", "@spectrum-css/button@^3.0.3":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/button/-/button-3.0.3.tgz#2df1efaab6c7e0b3b06cb4b59e1eae59c7f1fc84"
- integrity sha512-6CnLPqqtaU/PcSSIGeGRi0iFIIxIUByYLKFO6zn5NEUc12KQ28dJ4PLwB6WBa0L8vRoAGlnWWH2ZZweTijbXgg==
-
-"@spectrum-css/buttongroup@^3.0.2":
- version "3.0.10"
- resolved "https://registry.yarnpkg.com/@spectrum-css/buttongroup/-/buttongroup-3.0.10.tgz#897ea04b3ffea389fc7fe5bf67a6d1f3454b774d"
- integrity sha512-U7D24vgHYhlqOyaLJZ5LPskDAuD7cGZktmWvXtvLqG6RFyTr7JHn5oPRuo6mLzaggIHqCdJylOjZ4FHqT4LpTQ==
-
-"@spectrum-css/card@^3.0.3":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/card/-/card-3.0.3.tgz#56b2e2da6b80c1583228baa279de7407383bfb6b"
- integrity sha512-+oKLUI2a0QmQP9EzySeq/G4FpUkkdaDNbuEbqCj2IkPMc/2v/nwzsPhh1fj2UIghGAiiUwXfPpzax1e8fyhQUg==
-
-"@spectrum-css/checkbox@^3.0.2":
- version "3.0.12"
- resolved "https://registry.yarnpkg.com/@spectrum-css/checkbox/-/checkbox-3.0.12.tgz#adf20858b75790f49adbfcf79f69a5fcc2cc03c8"
- integrity sha512-5h+SxKCmeVHugvp6bZ0wuzW5nSYg0k7yUW00swNXlz3FY1IgbSLuKxpGE7id1D8tp5utfRquavJnON/F1yQSDA==
-
-"@spectrum-css/dialog@^3.0.1":
- version "3.0.12"
- resolved "https://registry.yarnpkg.com/@spectrum-css/dialog/-/dialog-3.0.12.tgz#fc97e002ca768a3d99dd10cb6a135c2b06052004"
- integrity sha512-50rbFa+9eUKT+3uYBX7CkmI7SbQ0Z3CAFwjyjai+itYZ8kf/FcHVFwcLjgrry9scUnKhexMs94kkr0gfQpPe8Q==
-
-"@spectrum-css/divider@^1.0.3":
- version "1.0.13"
- resolved "https://registry.yarnpkg.com/@spectrum-css/divider/-/divider-1.0.13.tgz#d31b368e85b53114427f765ca3bffefbd4643cd6"
- integrity sha512-b9BKy1got3Trx2HOT1D3U+H8D1vtSeRcZBbSJiluyJERpcenPr1sWiGVUZMG6Mqu2TCHTWf7lWibqnCmOWKQIw==
- dependencies:
- "@spectrum-css/vars" "^6.0.0"
-
-"@spectrum-css/dropzone@^3.0.2":
- version "3.0.13"
- resolved "https://registry.yarnpkg.com/@spectrum-css/dropzone/-/dropzone-3.0.13.tgz#c6d1004469e5e7b8d99a3a510ac7257d2236c3d4"
- integrity sha512-mkO65PlSeCNYnzVrUlC1+eCxxG4t3OgImtZ2odzKs/KAEK17ffBw4tovJ8sU6CM3EIEcNXpfjv6HULszDJRJuA==
-
-"@spectrum-css/fieldgroup@^3.0.2":
- version "3.0.12"
- resolved "https://registry.yarnpkg.com/@spectrum-css/fieldgroup/-/fieldgroup-3.0.12.tgz#1415d7bf21fcfc8e393eec6b415a18fb4314c16e"
- integrity sha512-JBmf3IZxh4TsI02K7mIK7GBzynm4j8F0dWZ9HlhCe62jgaEitmTtNm6ti9d+axMqo3L8g7fig7c/ESYV+GA2wA==
-
-"@spectrum-css/fieldlabel@^3.0.1":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/fieldlabel/-/fieldlabel-3.0.3.tgz#f73c04d20734d4718ffb620dc46458904685b449"
- integrity sha512-nEvIkEXCD5n4fW67Unq6Iu7VXoauEd/JGpfTY02VsC5p4FJLnwKfPDbJUuUsqClAxqw7nAsmXVKtn4zQFf5yPQ==
-
-"@spectrum-css/icon@^3.0.1":
- version "3.0.12"
- resolved "https://registry.yarnpkg.com/@spectrum-css/icon/-/icon-3.0.12.tgz#d45347f65139c6ed44222cbe70a069565c0c5f79"
- integrity sha512-KV2ZMOlYx5Qh06FOSDu0CyE7TL6zQAyEMUag/TX7uuiPPY/037aULEh4VsiuZgaJMEwX7dfkFzu+/aGN6CLeog==
-
-"@spectrum-css/illustratedmessage@^3.0.2":
- version "3.0.12"
- resolved "https://registry.yarnpkg.com/@spectrum-css/illustratedmessage/-/illustratedmessage-3.0.12.tgz#aa85a70ceb1fcdc0abb5aae4c90a73f01c40a31f"
- integrity sha512-T5RUf2zwc2q/VoqEabMin6jHwUBxVmsC7qm3P/fv9afp4Q7Q+3tPZnJDejP7k+F/FiSwMdOzmmX6W6ZBOWhSGw==
-
-"@spectrum-css/inlinealert@^2.0.1":
- version "2.0.6"
- resolved "https://registry.yarnpkg.com/@spectrum-css/inlinealert/-/inlinealert-2.0.6.tgz#4c5e923a1f56a96cc1adb30ef1f06ae04f2c6376"
- integrity sha512-OpvvoWP02wWyCnF4IgG8SOPkXymovkC9cGtgMS1FdDubnG3tJZB/JeKTsRR9C9Vt3WBaOmISRdSKlZ4lC9CFzA==
-
-"@spectrum-css/inputgroup@^3.0.2":
- version "3.0.8"
- resolved "https://registry.yarnpkg.com/@spectrum-css/inputgroup/-/inputgroup-3.0.8.tgz#fc23afc8a73c24d17249c9d2337e8b42085b298b"
- integrity sha512-cmQWzFp0GU+4IMc8SSeVFdmQDlRUdPelXaQdKUR9mZuO2iYettg37s0lfBCeJyYkUNTagz0zP8O7A0iXfmeE6g==
-
-"@spectrum-css/label@^2.0.10":
- version "2.0.10"
- resolved "https://registry.yarnpkg.com/@spectrum-css/label/-/label-2.0.10.tgz#2368651d7636a19385b5d300cdf6272db1916001"
- integrity sha512-xCbtEiQkZIlLdWFikuw7ifDCC21DOC/KMgVrrVJHXFc4KRQe9LTZSqmGF3tovm+CSq1adE59mYoTbojVQ9YuEQ==
-
-"@spectrum-css/link@^3.1.1", "@spectrum-css/link@^3.1.3":
- version "3.1.12"
- resolved "https://registry.yarnpkg.com/@spectrum-css/link/-/link-3.1.12.tgz#63899772b51ba2922f4297aeb37e6951fc53998e"
- integrity sha512-f8fWl/CYn5IavvdXi+XwNEthjye6e5qYy9dQZuNeRtz9oF8hFPxGT2j8yiRPIur/vyfYwFD4ZBV021cJ1g4Cjw==
-
-"@spectrum-css/menu@^3.0.1":
- version "3.0.12"
- resolved "https://registry.yarnpkg.com/@spectrum-css/menu/-/menu-3.0.12.tgz#1975b95fec98fe6daed8723fbd528f647a51d4f9"
- integrity sha512-ItOQLbXoTgzTBPw2S3sXbrEIR0clasYyHkku0kiP0XrZpt124QeXdhj+EXCdKqNM7F6tyep0K+goedq69RHsgg==
-
-"@spectrum-css/modal@^3.0.1":
- version "3.0.11"
- resolved "https://registry.yarnpkg.com/@spectrum-css/modal/-/modal-3.0.11.tgz#3838bffbb4709361b52d5f49d9fbf0e0c0eabf89"
- integrity sha512-IcyfvBU1nVN2w4NA1ExiLtMJ1kpYUHsT0GtLYC5b8MwwjOcg5MbaDMGKQqTpNUsrZWdmayP+I/Xr+ERHcdAJQg==
-
"@spectrum-css/page@^3.0.1":
version "3.0.8"
resolved "https://registry.yarnpkg.com/@spectrum-css/page/-/page-3.0.8.tgz#001efa9e4c10095df9b2b37cf7d7d6eb60140190"
@@ -1627,106 +1363,6 @@
dependencies:
"@spectrum-css/vars" "^4.3.0"
-"@spectrum-css/pagination@^3.0.3":
- version "3.0.11"
- resolved "https://registry.yarnpkg.com/@spectrum-css/pagination/-/pagination-3.0.11.tgz#68d9f34fe8eb36bf922e41b11f49eac62ac2fc41"
- integrity sha512-wjZr7NAcqHK6fxNIGKTYEVtAOJugJTbcz4d8K7DZuUDgBVwLJJHJBi4uJ4KrIRYliMWOvqWTZzCJLmmTfx4cyw==
-
-"@spectrum-css/picker@^1.0.1":
- version "1.1.7"
- resolved "https://registry.yarnpkg.com/@spectrum-css/picker/-/picker-1.1.7.tgz#d088efe91feb78143ffe4e512073127c6b35f8d5"
- integrity sha512-CiOfU5bTcc7PCWPc94alfO6SnGZD3sRvo52yMdD/RltnuUuWosU1XYuM5mPqkZ8Vok2N393wUI9C5FdvMJUHBw==
-
-"@spectrum-css/popover@^3.0.1":
- version "3.0.11"
- resolved "https://registry.yarnpkg.com/@spectrum-css/popover/-/popover-3.0.11.tgz#a7450c01bcf1609264b4a9df58821368b9e224d1"
- integrity sha512-bzyNQJVw6Mn1EBelTaRlXCdd0ZfykNX9O6SHx3a+jXPYu8VBrRpHm0gsfWzPAz1etd1vj1CxwG/teQt4qvyZ/Q==
-
-"@spectrum-css/progressbar@^1.0.2":
- version "1.0.13"
- resolved "https://registry.yarnpkg.com/@spectrum-css/progressbar/-/progressbar-1.0.13.tgz#a4e3fe1baae38d372107845c92f5d9b49168c722"
- integrity sha512-n9JeLvNvc7F8vAb6S83orhCrPsdEgdoE1g0p7n95Xhyh8647wPK9MNbjDLyYQeA7/qrJnYkBv9N/v7HdFFI8Mw==
-
-"@spectrum-css/progresscircle@^1.0.2":
- version "1.0.11"
- resolved "https://registry.yarnpkg.com/@spectrum-css/progresscircle/-/progresscircle-1.0.11.tgz#d659ba207ce3694668692e444cac3233d89ddb95"
- integrity sha512-huejk0rPLI/iyHGB+PnsY7EEgR33Z6J0BIKSm7PWx0zOGOQUJ6dNgolEPnjctMFvoPkdOosMRacTb1HwwSJ5Eg==
-
-"@spectrum-css/radio@^3.0.2":
- version "3.0.12"
- resolved "https://registry.yarnpkg.com/@spectrum-css/radio/-/radio-3.0.12.tgz#a0b0e371bab002dc8d64c5e086263aff4c51adc6"
- integrity sha512-Ix32w2H9o59dULObOL0NkPtARCGGh2xvxbPvOI72gVN9C5pmLLbf3tgN3LjH/uDuLNKta4tp+I34IHO0ipHxKQ==
-
-"@spectrum-css/search@^3.0.2":
- version "3.1.2"
- resolved "https://registry.yarnpkg.com/@spectrum-css/search/-/search-3.1.2.tgz#8d43f35f884f7c190e7694c8d26a3f2cfed01ef0"
- integrity sha512-8cMK1QB07dbReZ/ECyTyoT2dELZ7hK1b3jEDiWSeLBbXcKirR1OI24sZEnewQY/XWFd/62Z1YdNaaA8S6UuXWQ==
-
-"@spectrum-css/sidenav@^3.0.2":
- version "3.0.12"
- resolved "https://registry.yarnpkg.com/@spectrum-css/sidenav/-/sidenav-3.0.12.tgz#72ccdb91c307e199bbc6d94cf8b9b80c9f12d90e"
- integrity sha512-N9uLDg7v1vppVUFiDR9KM/OQDDgiysGqKZVou7urp52tHi21Hh+T5Hhz08v06kzIQ3gcu0UJvk/TLiNA+hohmg==
-
-"@spectrum-css/statuslight@^3.0.2":
- version "3.0.8"
- resolved "https://registry.yarnpkg.com/@spectrum-css/statuslight/-/statuslight-3.0.8.tgz#3b0ea80712573679870a85d469850230e794a0f7"
- integrity sha512-zMTHs8lk+I7fLdi9waEEbsCmJ1FxeHcjQ0yltWxuRmGk2vl4MQdQIuHIMI63iblqEaiwnJRjXJoKnWlNvndTJQ==
-
-"@spectrum-css/stepper@^3.0.3":
- version "3.0.13"
- resolved "https://registry.yarnpkg.com/@spectrum-css/stepper/-/stepper-3.0.13.tgz#1a969373f963c1e15375c50136c64144697f9d2c"
- integrity sha512-Q/TnECAR/TlPoF9Ki1X/+iDA+2s+yvXCuGvQ4VA2O2RemuAdWgO2OaguLKmocRPyo9728ykztkieT7OIOr0A0A==
-
-"@spectrum-css/switch@^1.0.2":
- version "1.0.11"
- resolved "https://registry.yarnpkg.com/@spectrum-css/switch/-/switch-1.0.11.tgz#e93c484caf3f747ec81d162f9c68b32c2774883c"
- integrity sha512-W9UZ1CGsEPCH8rN0HIMwr2JM01ixpoA8uaE3ZwOxR9XfNrKiE6EkazhINMFRiY1CQt0AD5gAG2KSFBpGcPxcJA==
-
-"@spectrum-css/table@^3.0.1":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/table/-/table-3.0.3.tgz#7f7f19905ef3275cbf907ce3a5818e63c30b2caf"
- integrity sha512-nxwzVjLPsXoY/v4sdxOVYLcC+cEbGgJyLcLclT5LT9MGSbngFeUMJzzVR4EvehzuN4dH7hrATG7Mbuq29Mf0Hg==
-
-"@spectrum-css/tabs@^3.0.1":
- version "3.1.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/tabs/-/tabs-3.1.9.tgz#be4fcd004367f2ff26d3d3d8dc7f95cd80ad6bd2"
- integrity sha512-JwEJVaqtwvXveddObfce22wC2+kVjKs/Bm7ySGzh0WJaMLPXBeBuNcjFEEc5K/3D5tzqBGQcwdGnRxjfd7NcKw==
-
-"@spectrum-css/tags@^3.0.2":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/tags/-/tags-3.0.3.tgz#fc76d2735cdc442de91b7eb3bee49a928c0767ac"
- integrity sha512-SL8vPxVDfWcY5VdIuyl0TImEXcOU1I7yCyXkk7MudMwfnYs81FaIyY32hFV9OHj0Tz/36UzRzc7AVMSuRQ53pw==
-
-"@spectrum-css/textfield@^3.0.1":
- version "3.1.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/textfield/-/textfield-3.1.3.tgz#db413eee384a2469725326950d6bfab7b0f59984"
- integrity sha512-841Su0joO+5cICgI+ysjRtkF6ZwrwQpa1moJ+C8rAiu+vEkWGqrN7ejIKf8mmj2BV0iDP3ZKmEnzCm3Nxc0XCg==
-
-"@spectrum-css/toast@^3.0.1":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/toast/-/toast-3.0.3.tgz#97c1527384707600832ecda35643ed304615250f"
- integrity sha512-CjLeaMs+cjUXojCCRtbj0YkD2BoZW16kjj2o5omkEpUTjA34IJ8xJ1a+CCtDILWekhXvN0MBN4sbumcnwcnx8w==
-
-"@spectrum-css/tooltip@^3.0.3":
- version "3.1.6"
- resolved "https://registry.yarnpkg.com/@spectrum-css/tooltip/-/tooltip-3.1.6.tgz#cbd811c2231bb9826a3e0b63d25397b6886aef70"
- integrity sha512-shIO/Z8sgG+eRP7NDBl9LR4U9ViUChW/3+C+4Li8oFOj8z1PtKqX9vOwEShjP9W8Aeg783IdhmONFA2N8W33eg==
-
-"@spectrum-css/treeview@^3.0.2":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/treeview/-/treeview-3.0.3.tgz#aeda5175158b9f8d7529cb2b394428eb2a428046"
- integrity sha512-D5gGzZC/KtRArdx86Mesc9+99W9nTbUOeyYGqoJoAfJSOttoT6Tk5CrDvlCmAqjKf5rajemAkGri1ChqvUIwkw==
-
-"@spectrum-css/typography@^3.0.1", "@spectrum-css/typography@^3.0.2":
- version "3.0.2"
- resolved "https://registry.yarnpkg.com/@spectrum-css/typography/-/typography-3.0.2.tgz#ea3ca0a60e18064527819d48c8c4364cab4fcd38"
- integrity sha512-5ZOLmQe0edzsDMyhghUd4hBb5uxGsFrxzf+WasfcUw9klSfTsRZ09n1BsaaWbgrLjlMQ+EEHS46v5VNo0Ms2CA==
-
-"@spectrum-css/underlay@^2.0.9":
- version "2.0.20"
- resolved "https://registry.yarnpkg.com/@spectrum-css/underlay/-/underlay-2.0.20.tgz#d7d511b6eb8192595d95d728b173d64cbdbd6f5f"
- integrity sha512-+IjnVVti3eE8PfxeOX7u2fR5FMi0M5X1Qu0V/+E/qTMag9Xe3TQvqs1RsGAwCJrsi/SFuIOvKMnkbCMq1FGYow==
-
"@spectrum-css/vars@^3.0.1":
version "3.0.2"
resolved "https://registry.yarnpkg.com/@spectrum-css/vars/-/vars-3.0.2.tgz#ea9062c3c98dfc6ba59e5df14a03025ad8969999"
@@ -1737,11 +1373,6 @@
resolved "https://registry.yarnpkg.com/@spectrum-css/vars/-/vars-4.3.0.tgz#03ddf67d3aa8a9a4cb0edbbd259465c9ced7e70d"
integrity sha512-ZQ2XAhgu4G9yBeXQNDAz07Z8oZNnMt5o9vzf/mpBA7Teb/JI+8qXp2wt8D245SzmtNlFkG/bzRYvQc0scgZeCQ==
-"@spectrum-css/vars@^6.0.0":
- version "6.0.0"
- resolved "https://registry.yarnpkg.com/@spectrum-css/vars/-/vars-6.0.0.tgz#a159995a3e04cb4cc5ed1475f3125c305e7956d4"
- integrity sha512-cybE4jDpw0L3GdyGDgSVQl6O7GEL9xTs5FRpJu20B1l5G9rERq3yKW8iZVyEYRtyr3iChe1idDPB5TG8bXNxqQ==
-
"@sveltejs/vite-plugin-svelte@1.0.0-next.19":
version "1.0.0-next.19"
resolved "https://registry.yarnpkg.com/@sveltejs/vite-plugin-svelte/-/vite-plugin-svelte-1.0.0-next.19.tgz#9646abc2cd1982146db4bb341aafdb5f32f19dd2"
@@ -1795,6 +1426,26 @@
resolved "https://registry.yarnpkg.com/@tootallnate/once/-/once-1.1.2.tgz#ccb91445360179a04e7fe6aff78c00ffc1eeaf82"
integrity sha512-RbzJvlNzmRq5c3O09UipeuXno4tA1FE6ikOjxZK0tuxVv3412l64l5t1W5pj4+rJq9vpkm/kwiR07aZXnsKPxw==
+"@tsconfig/node10@^1.0.7":
+ version "1.0.8"
+ resolved "https://registry.yarnpkg.com/@tsconfig/node10/-/node10-1.0.8.tgz#c1e4e80d6f964fbecb3359c43bd48b40f7cadad9"
+ integrity sha512-6XFfSQmMgq0CFLY1MslA/CPUfhIL919M1rMsa5lP2P097N2Wd1sSX0tx1u4olM16fLNhtHZpRhedZJphNJqmZg==
+
+"@tsconfig/node12@^1.0.7":
+ version "1.0.9"
+ resolved "https://registry.yarnpkg.com/@tsconfig/node12/-/node12-1.0.9.tgz#62c1f6dee2ebd9aead80dc3afa56810e58e1a04c"
+ integrity sha512-/yBMcem+fbvhSREH+s14YJi18sp7J9jpuhYByADT2rypfajMZZN4WQ6zBGgBKp53NKmqI36wFYDb3yaMPurITw==
+
+"@tsconfig/node14@^1.0.0":
+ version "1.0.1"
+ resolved "https://registry.yarnpkg.com/@tsconfig/node14/-/node14-1.0.1.tgz#95f2d167ffb9b8d2068b0b235302fafd4df711f2"
+ integrity sha512-509r2+yARFfHHE7T6Puu2jjkoycftovhXRqW328PDXTVGKihlb1P8Z9mMZH04ebyajfRY7dedfGynlrFHJUQCg==
+
+"@tsconfig/node16@^1.0.2":
+ version "1.0.2"
+ resolved "https://registry.yarnpkg.com/@tsconfig/node16/-/node16-1.0.2.tgz#423c77877d0569db20e1fc80885ac4118314010e"
+ integrity sha512-eZxlbI8GZscaGS7kkc/trHTT5xgrjH3/1n2JDwusC9iahPKWMRvRjJSAN5mCXviuTGQ/lHnhvv8Q1YTpnfz9gA==
+
"@types/aria-query@^4.2.0":
version "4.2.2"
resolved "https://registry.yarnpkg.com/@types/aria-query/-/aria-query-4.2.2.tgz#ed4e0ad92306a704f9fb132a0cfcf77486dbe2bc"
@@ -1971,6 +1622,11 @@ acorn-walk@^7.1.1:
resolved "https://registry.yarnpkg.com/acorn-walk/-/acorn-walk-7.2.0.tgz#0de889a601203909b0fbe07b8938dc21d2e967bc"
integrity sha512-OPdCF6GsMIP+Az+aWfAAOEt2/+iVDKE7oy6lJ098aoe59oAmK76qV6Gw60SbZ8jHuG2wH058GF4pLFbYamYrVA==
+acorn-walk@^8.1.1:
+ version "8.2.0"
+ resolved "https://registry.yarnpkg.com/acorn-walk/-/acorn-walk-8.2.0.tgz#741210f2e2426454508853a2f44d0ab83b7f69c1"
+ integrity sha512-k+iyHEuPgSw6SbuDpGQM+06HQUa04DZ3o+F6CSzXMvvI5KMvnaEqXe+YVe555R9nn6GPt404fos4wcgpw12SDA==
+
acorn@^7.1.1:
version "7.4.1"
resolved "https://registry.yarnpkg.com/acorn/-/acorn-7.4.1.tgz#feaed255973d2e77555b83dbc08851a6c63520fa"
@@ -1981,6 +1637,11 @@ acorn@^8.2.4:
resolved "https://registry.yarnpkg.com/acorn/-/acorn-8.5.0.tgz#4512ccb99b3698c752591e9bb4472e38ad43cee2"
integrity sha512-yXbYeFy+jUuYd3/CDcg2NkIYE991XYX/bje7LmjJigUciaeO1JR4XxXgCIV1/Zc/dRuFEyw1L0pbA+qynJkW5Q==
+acorn@^8.4.1:
+ version "8.7.0"
+ resolved "https://registry.yarnpkg.com/acorn/-/acorn-8.7.0.tgz#90951fde0f8f09df93549481e5fc141445b791cf"
+ integrity sha512-V/LGr1APy+PXIwKebEWrkZPwoeoF+w1jiOBUmuxuiUIaOHtob8Qc9BTrYo7VuI5fR8tqsy+buA2WFooR5olqvQ==
+
agent-base@6:
version "6.0.2"
resolved "https://registry.yarnpkg.com/agent-base/-/agent-base-6.0.2.tgz#49fff58577cfee3f37176feab4c22e00f86d7f77"
@@ -2070,24 +1731,17 @@ anymatch@^3.0.3:
normalize-path "^3.0.0"
picomatch "^2.0.4"
-apexcharts@^3.19.2, apexcharts@^3.22.1:
- version "3.33.0"
- resolved "https://registry.yarnpkg.com/apexcharts/-/apexcharts-3.33.0.tgz#8fb807fb6c5a55a37a1168f0dbf0548d1ae69fdb"
- integrity sha512-gOc0qZijuomtXTThLbb0sKn+mZJkVQADyK/Zw9vQ0JjKVbMYxzek61xk40hT49i1Sq6/MUqsz0WgUXYpqqf8Mg==
- dependencies:
- svg.draggable.js "^2.2.2"
- svg.easing.js "^2.0.0"
- svg.filter.js "^2.0.2"
- svg.pathmorphing.js "^0.1.3"
- svg.resize.js "^1.4.3"
- svg.select.js "^3.0.1"
-
arch@^2.1.2:
version "2.2.0"
resolved "https://registry.yarnpkg.com/arch/-/arch-2.2.0.tgz#1bc47818f305764f23ab3306b0bfc086c5a29d11"
integrity sha512-Of/R0wqp83cgHozfIYLbBMnej79U/SVGOOyuB3VVFv1NRM/PSFMK12x9KVtiYzJqmnU5WR2qp0Z5rHb7sWGnFQ==
-argparse@^1.0.10, argparse@^1.0.7:
+arg@^4.1.0:
+ version "4.1.3"
+ resolved "https://registry.yarnpkg.com/arg/-/arg-4.1.3.tgz#269fc7ad5b8e42cb63c896d5666017261c144089"
+ integrity sha512-58S9QDqG0Xx27YwPSt9fJxivjYl432YCwfDMfZ+71RAqUrZef7LrKQZ3LHLOwCS4FLNBplP533Zx895SeOCHvA==
+
+argparse@^1.0.7:
version "1.0.10"
resolved "https://registry.yarnpkg.com/argparse/-/argparse-1.0.10.tgz#bcd6791ea5ae09725e17e5ad988134cd40b3d911"
integrity sha512-o5Roy6tNG4SL/FOkCAN6RzjiakZS25RLYFrcMttJqbdd8BWrnA+fGz57iN5Pb06pvBGvl5gQ0B48dJlslXvoTg==
@@ -2117,15 +1771,6 @@ arr-union@^3.1.0:
resolved "https://registry.yarnpkg.com/arr-union/-/arr-union-3.1.0.tgz#e39b09aea9def866a8f206e288af63919bae39c4"
integrity sha1-45sJrqne+Gao8gbiiK9jkZuuOcQ=
-array-sort@^1.0.0:
- version "1.0.0"
- resolved "https://registry.yarnpkg.com/array-sort/-/array-sort-1.0.0.tgz#e4c05356453f56f53512a7d1d6123f2c54c0a88a"
- integrity sha512-ihLeJkonmdiAsD7vpgN3CRcx2J2S0TiYW+IS/5zHBI7mKUq3ySvBdzzBfD236ubDBQFiiyG3SWCPc+msQ9KoYg==
- dependencies:
- default-compare "^1.0.0"
- get-value "^2.0.6"
- kind-of "^5.0.2"
-
array-union@^2.1.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/array-union/-/array-union-2.1.0.tgz#b798420adbeb1de828d84acd8a2e23d3efe85e8d"
@@ -2173,13 +1818,6 @@ atob@^2.1.2:
resolved "https://registry.yarnpkg.com/atob/-/atob-2.1.2.tgz#6d9517eb9e030d2436666651e86bd9f6f13533c9"
integrity sha512-Wm6ukoaOGJi/73p/cl2GvLjTI5JM1k/O14isD73YML8StrH/7/lRFgmg8nICZgD3bZZvjwCGxtMOD3wWNAu8cg==
-autolinker@~0.28.0:
- version "0.28.1"
- resolved "https://registry.yarnpkg.com/autolinker/-/autolinker-0.28.1.tgz#0652b491881879f0775dace0cdca3233942a4e47"
- integrity sha1-BlK0kYgYefB3XazgzcoyM5QqTkc=
- dependencies:
- gulp-header "^1.7.1"
-
aws-sign2@~0.7.0:
version "0.7.0"
resolved "https://registry.yarnpkg.com/aws-sign2/-/aws-sign2-0.7.0.tgz#b46e890934a9591f2d2f6f86d7e6a9f1b3fe76a8"
@@ -2671,13 +2309,6 @@ concat-stream@^1.6.2:
readable-stream "^2.2.2"
typedarray "^0.0.6"
-concat-with-sourcemaps@*:
- version "1.1.0"
- resolved "https://registry.yarnpkg.com/concat-with-sourcemaps/-/concat-with-sourcemaps-1.1.0.tgz#d4ea93f05ae25790951b99e7b3b09e3908a4082e"
- integrity sha512-4gEjHJFT9e+2W/77h/DS5SGUgwDaOwprX8L/gl5+3ixnzkVJJsZWDSelmN3Oilw3LNDZjZV0yqH1hLG3k6nghg==
- dependencies:
- source-map "^0.6.1"
-
configent@^2.1.4:
version "2.2.0"
resolved "https://registry.yarnpkg.com/configent/-/configent-2.2.0.tgz#2de230fc43f22c47cfd99016aa6962d6f9546994"
@@ -2720,6 +2351,11 @@ core-util-is@~1.0.0:
resolved "https://registry.yarnpkg.com/core-util-is/-/core-util-is-1.0.3.tgz#a6042d3634c2b27e9328f837b965fac83808db85"
integrity sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ==
+create-require@^1.1.0:
+ version "1.1.1"
+ resolved "https://registry.yarnpkg.com/create-require/-/create-require-1.1.1.tgz#c1d7e8f1e5f6cfc9ff65f9cd352d37348756c333"
+ integrity sha512-dcKFX3jn0MpIaXjisoRvexIJVEKzaq7z2rZKxf+MSr9TkdmHmsU4m2lcLojrj/FHl8mk5VxMmYA+ftRkP/3oKQ==
+
cross-spawn@^6.0.0:
version "6.0.5"
resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-6.0.5.tgz#4a5ec7c64dfae22c3a14124dbacdee846d80cbc4"
@@ -2844,18 +2480,6 @@ date-fns@^1.27.2:
resolved "https://registry.yarnpkg.com/date-fns/-/date-fns-1.30.1.tgz#2e71bf0b119153dbb4cc4e88d9ea5acfb50dc05c"
integrity sha512-hBSVCvSmWC+QypYObzwGOd9wqdDpOt+0wl0KbU+R+uuZBS1jN8VsD1ss3irQDknRj5NvxiTF6oj/nDRnN/UQNw==
-date.js@^0.3.1:
- version "0.3.3"
- resolved "https://registry.yarnpkg.com/date.js/-/date.js-0.3.3.tgz#ef1e92332f507a638795dbb985e951882e50bbda"
- integrity sha512-HgigOS3h3k6HnW011nAb43c5xx5rBXk8P2v/WIT9Zv4koIaVXiH2BURguI78VVp+5Qc076T7OR378JViCnZtBw==
- dependencies:
- debug "~3.1.0"
-
-dayjs@^1.10.4, dayjs@^1.10.5:
- version "1.10.7"
- resolved "https://registry.yarnpkg.com/dayjs/-/dayjs-1.10.7.tgz#2cf5f91add28116748440866a0a1d26f3a6ce468"
- integrity sha512-P6twpd70BcPK34K26uJ1KT3wlhpuOAPoMwJzpsIWUxHZ7wpmbdZL/hQqBDfz7hGurYSa5PhzdhDHtt319hL3ig==
-
debug@4, debug@4.3.2, debug@^4.1.0, debug@^4.1.1, debug@^4.3.2:
version "4.3.2"
resolved "https://registry.yarnpkg.com/debug/-/debug-4.3.2.tgz#f0a49c18ac8779e31d4a0c6029dfb76873c7428b"
@@ -2877,13 +2501,6 @@ debug@^3.1.0:
dependencies:
ms "^2.1.1"
-debug@~3.1.0:
- version "3.1.0"
- resolved "https://registry.yarnpkg.com/debug/-/debug-3.1.0.tgz#5bb5a0672628b64149566ba16819e61518c67261"
- integrity sha512-OX8XqP7/1a9cqkxYw2yXss15f26NKWBpDXQd0/uK/KPqdQhxbPa994hnzjcE2VqQpDslf55723cKPUOGSmMY3g==
- dependencies:
- ms "2.0.0"
-
decamelize@^1.2.0:
version "1.2.0"
resolved "https://registry.yarnpkg.com/decamelize/-/decamelize-1.2.0.tgz#f6534d15148269b20352e7bee26f501f9a191290"
@@ -2909,13 +2526,6 @@ deepmerge@^4.2.2:
resolved "https://registry.yarnpkg.com/deepmerge/-/deepmerge-4.2.2.tgz#44d2ea3679b8f4d4ffba33f03d865fc1e7bf4955"
integrity sha512-FJ3UgI4gIl+PHZm53knsuSFpE+nESMr7M4v9QcgB7S63Kj/6WqMiFQJpBBYz1Pt+66bZpP3Q7Lye0Oo9MPKEdg==
-default-compare@^1.0.0:
- version "1.0.0"
- resolved "https://registry.yarnpkg.com/default-compare/-/default-compare-1.0.0.tgz#cb61131844ad84d84788fb68fd01681ca7781a2f"
- integrity sha512-QWfXlM0EkAbqOCbD/6HjdwT19j7WCkMyiRhWilc4H9/5h/RzTF9gv5LYh1+CmDV5d1rki6KAWLtQale0xt20eQ==
- dependencies:
- kind-of "^5.0.2"
-
define-properties@^1.1.3:
version "1.1.3"
resolved "https://registry.yarnpkg.com/define-properties/-/define-properties-1.1.3.tgz#cf88da6cbee26fe6db7094f61d870cbd84cee9f1"
@@ -2965,6 +2575,11 @@ diff-sequences@^27.0.6:
resolved "https://registry.yarnpkg.com/diff-sequences/-/diff-sequences-27.0.6.tgz#3305cb2e55a033924054695cc66019fd7f8e5723"
integrity sha512-ag6wfpBFyNXZ0p8pcuIDS//D8H062ZQJ3fzYxjpmeKjnz8W4pekL3AI8VohmyZmsWW2PWaHgjsmqR6L13101VQ==
+diff@^4.0.1:
+ version "4.0.2"
+ resolved "https://registry.yarnpkg.com/diff/-/diff-4.0.2.tgz#60f3aecb89d5fae520c11aa19efc2bb982aade7d"
+ integrity sha512-58lmxKSA4BNyLz+HHMUzlOEpg09FV+ev6ZMe3vJihgdxzgcwZ8VoEEPmALCZG9LmqfVoNMMKpttIYTVG6uDY7A==
+
dir-glob@^3.0.1:
version "3.0.1"
resolved "https://registry.yarnpkg.com/dir-glob/-/dir-glob-3.0.1.tgz#56dbf73d992a4a93ba1584f4534063fd2e41717f"
@@ -3034,11 +2649,6 @@ end-of-stream@^1.1.0:
dependencies:
once "^1.4.0"
-ent@^2.2.0:
- version "2.2.0"
- resolved "https://registry.yarnpkg.com/ent/-/ent-2.2.0.tgz#e964219325a21d05f44466a2f686ed6ce5f5dd1d"
- integrity sha1-6WQhkyWiHQX0RGai9obtbOX13R0=
-
error-ex@^1.3.1:
version "1.3.2"
resolved "https://registry.yarnpkg.com/error-ex/-/error-ex-1.3.2.tgz#b4ac40648107fdcdcfae242f428bea8a14d4f1bf"
@@ -3465,11 +3075,6 @@ find-up@^4.0.0, find-up@^4.1.0:
locate-path "^5.0.0"
path-exists "^4.0.0"
-flatpickr@^4.5.2:
- version "4.6.9"
- resolved "https://registry.yarnpkg.com/flatpickr/-/flatpickr-4.6.9.tgz#9a13383e8a6814bda5d232eae3fcdccb97dc1499"
- integrity sha512-F0azNNi8foVWKSF+8X+ZJzz8r9sE1G4hl06RyceIaLvyltKvDl6vqk9Lm/6AUUCi5HWaIjiUbk7UpeE/fOXOpw==
-
fn-name@~3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/fn-name/-/fn-name-3.0.0.tgz#0596707f635929634d791f452309ab41558e3c5c"
@@ -3520,11 +3125,6 @@ from@~0:
resolved "https://registry.yarnpkg.com/from/-/from-0.1.7.tgz#83c60afc58b9c56997007ed1a768b3ab303a44fe"
integrity sha1-g8YK/Fi5xWmXAH7Rp2izqzA6RP4=
-fs-exists-sync@^0.1.0:
- version "0.1.0"
- resolved "https://registry.yarnpkg.com/fs-exists-sync/-/fs-exists-sync-0.1.0.tgz#982d6893af918e72d08dec9e8673ff2b5a8d6add"
- integrity sha1-mC1ok6+RjnLQjeyehnP/K1qNat0=
-
fs-extra@^8.1.0:
version "8.1.0"
resolved "https://registry.yarnpkg.com/fs-extra/-/fs-extra-8.1.0.tgz#49d43c45a88cd9677668cb7be1b46efdb8d2e1c0"
@@ -3578,14 +3178,6 @@ get-intrinsic@^1.0.2:
has "^1.0.3"
has-symbols "^1.0.1"
-get-object@^0.2.0:
- version "0.2.0"
- resolved "https://registry.yarnpkg.com/get-object/-/get-object-0.2.0.tgz#d92ff7d5190c64530cda0543dac63a3d47fe8c0c"
- integrity sha1-2S/31RkMZFMM2gVD2sY6PUf+jAw=
- dependencies:
- is-number "^2.0.2"
- isobject "^0.2.0"
-
get-package-type@^0.1.0:
version "0.1.0"
resolved "https://registry.yarnpkg.com/get-package-type/-/get-package-type-0.1.0.tgz#8de2d803cff44df3bc6c456e6668b36c3926e11a"
@@ -3615,13 +3207,6 @@ get-value@^2.0.3, get-value@^2.0.6:
resolved "https://registry.yarnpkg.com/get-value/-/get-value-2.0.6.tgz#dc15ca1c672387ca76bd37ac0a395ba2042a2c28"
integrity sha1-3BXKHGcjh8p2vTesCjlbogQqLCg=
-get-value@^3.0.0, get-value@^3.0.1:
- version "3.0.1"
- resolved "https://registry.yarnpkg.com/get-value/-/get-value-3.0.1.tgz#5efd2a157f1d6a516d7524e124ac52d0a39ef5a8"
- integrity sha512-mKZj9JLQrwMBtj5wxi6MH8Z5eSKaERpAwjg43dPtlGI1ZVEgH/qC7T8/6R2OBSUA+zzHBZgICsVJaEIV2tKTDA==
- dependencies:
- isobject "^3.0.1"
-
getos@^3.2.1:
version "3.2.1"
resolved "https://registry.yarnpkg.com/getos/-/getos-3.2.1.tgz#0134d1f4e00eb46144c5a9c0ac4dc087cbb27dc5"
@@ -3691,35 +3276,6 @@ growly@^1.3.0:
resolved "https://registry.yarnpkg.com/growly/-/growly-1.3.0.tgz#f10748cbe76af964b7c96c93c6bcc28af120c081"
integrity sha1-8QdIy+dq+WS3yWyTxrzCivEgwIE=
-gulp-header@^1.7.1:
- version "1.8.12"
- resolved "https://registry.yarnpkg.com/gulp-header/-/gulp-header-1.8.12.tgz#ad306be0066599127281c4f8786660e705080a84"
- integrity sha512-lh9HLdb53sC7XIZOYzTXM4lFuXElv3EVkSDhsd7DoJBj7hm+Ni7D3qYbb+Rr8DuM8nRanBvkVO9d7askreXGnQ==
- dependencies:
- concat-with-sourcemaps "*"
- lodash.template "^4.4.0"
- through2 "^2.0.0"
-
-handlebars-utils@^1.0.2, handlebars-utils@^1.0.4, handlebars-utils@^1.0.6:
- version "1.0.6"
- resolved "https://registry.yarnpkg.com/handlebars-utils/-/handlebars-utils-1.0.6.tgz#cb9db43362479054782d86ffe10f47abc76357f9"
- integrity sha512-d5mmoQXdeEqSKMtQQZ9WkiUcO1E3tPbWxluCK9hVgIDPzQa9WsKo3Lbe/sGflTe7TomHEeZaOgwIkyIr1kfzkw==
- dependencies:
- kind-of "^6.0.0"
- typeof-article "^0.1.1"
-
-handlebars@^4.7.6, handlebars@^4.7.7:
- version "4.7.7"
- resolved "https://registry.yarnpkg.com/handlebars/-/handlebars-4.7.7.tgz#9ce33416aad02dbd6c8fafa8240d5d98004945a1"
- integrity sha512-aAcXm5OAfE/8IXkcZvCepKU3VzW1/39Fb5ZuqMtgI/hT8X2YgoMvBY5dLhq/cpOvw7Lk1nK/UF71aLG/ZnVYRA==
- dependencies:
- minimist "^1.2.5"
- neo-async "^2.6.0"
- source-map "^0.6.1"
- wordwrap "^1.0.0"
- optionalDependencies:
- uglify-js "^3.1.4"
-
har-schema@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/har-schema/-/har-schema-2.0.0.tgz#a94c2224ebcac04782a0d9035521f24735b7ec92"
@@ -3778,14 +3334,6 @@ has-value@^1.0.0:
has-values "^1.0.0"
isobject "^3.0.0"
-has-value@^2.0.2:
- version "2.0.2"
- resolved "https://registry.yarnpkg.com/has-value/-/has-value-2.0.2.tgz#d0f12e8780ba8e90e66ad1a21c707fdb67c25658"
- integrity sha512-ybKOlcRsK2MqrM3Hmz/lQxXHZ6ejzSPzpNabKB45jb5qDgJvKPa3SdapTsTLwEb9WltgWpOmNax7i+DzNOk4TA==
- dependencies:
- get-value "^3.0.0"
- has-values "^2.0.1"
-
has-values@^0.1.4:
version "0.1.4"
resolved "https://registry.yarnpkg.com/has-values/-/has-values-0.1.4.tgz#6d61de95d91dfca9b9a02089ad384bff8f62b771"
@@ -3799,13 +3347,6 @@ has-values@^1.0.0:
is-number "^3.0.0"
kind-of "^4.0.0"
-has-values@^2.0.1:
- version "2.0.1"
- resolved "https://registry.yarnpkg.com/has-values/-/has-values-2.0.1.tgz#3876200ff86d8a8546a9264a952c17d5fc17579d"
- integrity sha512-+QdH3jOmq9P8GfdjFg0eJudqx1FqU62NQJ4P16rOEHeRdl7ckgwn6uqQjzYE0ZoHVV/e5E2esuJ5Gl5+HUW19w==
- dependencies:
- kind-of "^6.0.2"
-
has@^1.0.3:
version "1.0.3"
resolved "https://registry.yarnpkg.com/has/-/has-1.0.3.tgz#722d7cbfc1f6aa8241f16dd814e011e1f41e8796"
@@ -3813,39 +3354,6 @@ has@^1.0.3:
dependencies:
function-bind "^1.1.1"
-helper-date@^1.0.1:
- version "1.0.1"
- resolved "https://registry.yarnpkg.com/helper-date/-/helper-date-1.0.1.tgz#12fedea3ad8e44a7ca4c4efb0ff4104a5120cffb"
- integrity sha512-wU3VOwwTJvGr/w5rZr3cprPHO+hIhlblTJHD6aFBrKLuNbf4lAmkawd2iK3c6NbJEvY7HAmDpqjOFSI5/+Ey2w==
- dependencies:
- date.js "^0.3.1"
- handlebars-utils "^1.0.4"
- moment "^2.18.1"
-
-helper-markdown@^1.0.0:
- version "1.0.0"
- resolved "https://registry.yarnpkg.com/helper-markdown/-/helper-markdown-1.0.0.tgz#ee7e9fc554675007d37eb90f7853b13ce74f3e10"
- integrity sha512-AnDqMS4ejkQK0MXze7pA9TM3pu01ZY+XXsES6gEE0RmCGk5/NIfvTn0NmItfyDOjRAzyo9z6X7YHbHX4PzIvOA==
- dependencies:
- handlebars-utils "^1.0.2"
- highlight.js "^9.12.0"
- remarkable "^1.7.1"
-
-helper-md@^0.2.2:
- version "0.2.2"
- resolved "https://registry.yarnpkg.com/helper-md/-/helper-md-0.2.2.tgz#c1f59d7e55bbae23362fd8a0e971607aec69d41f"
- integrity sha1-wfWdflW7riM2L9ig6XFgeuxp1B8=
- dependencies:
- ent "^2.2.0"
- extend-shallow "^2.0.1"
- fs-exists-sync "^0.1.0"
- remarkable "^1.6.2"
-
-highlight.js@^9.12.0:
- version "9.18.5"
- resolved "https://registry.yarnpkg.com/highlight.js/-/highlight.js-9.18.5.tgz#d18a359867f378c138d6819edfc2a8acd5f29825"
- integrity sha512-a5bFyofd/BHCX52/8i8uJkjr9DYwXIPnM/plwI6W7ezItLGqzt7X2G2nXuYSfsIJdkwwj/g9DG1LkcGJI/dDoA==
-
hosted-git-info@^2.1.4:
version "2.8.9"
resolved "https://registry.yarnpkg.com/hosted-git-info/-/hosted-git-info-2.8.9.tgz#dffc0bf9a21c02209090f2aa69429e1414daf3f9"
@@ -3863,14 +3371,6 @@ html-escaper@^2.0.0:
resolved "https://registry.yarnpkg.com/html-escaper/-/html-escaper-2.0.2.tgz#dfd60027da36a36dfcbe236262c00a5822681453"
integrity sha512-H2iMtd0I4Mt5eYiapRdIDjp+XzelXQ0tFE4JS7YFwFevXXMmOp9myNrUvCg0D6ws8iqkRPBfKHgbwig1SmlLfg==
-html-tag@^2.0.0:
- version "2.0.0"
- resolved "https://registry.yarnpkg.com/html-tag/-/html-tag-2.0.0.tgz#36c3bc8d816fd30b570d5764a497a641640c2fed"
- integrity sha512-XxzooSo6oBoxBEUazgjdXj7VwTn/iSTSZzTYKzYY6I916tkaYzypHxy+pbVU1h+0UQ9JlVf5XkNQyxOAiiQO1g==
- dependencies:
- is-self-closing "^1.0.1"
- kind-of "^6.0.0"
-
http-proxy-agent@^4.0.1:
version "4.0.1"
resolved "https://registry.yarnpkg.com/http-proxy-agent/-/http-proxy-agent-4.0.1.tgz#8a8c8ef7f5932ccf953c296ca8291b95aa74aa3a"
@@ -4042,13 +3542,6 @@ is-docker@^2.0.0:
resolved "https://registry.yarnpkg.com/is-docker/-/is-docker-2.2.1.tgz#33eeabe23cfe86f14bde4408a02c0cfb853acdaa"
integrity sha512-F+i2BKsFrH66iaUFc0woD8sLy8getkwTwtOBjvs56Cx4CgJDeKQeqfz8wAYiSb8JOprWhHH5p77PbmYCvvUuXQ==
-is-even@^1.0.0:
- version "1.0.0"
- resolved "https://registry.yarnpkg.com/is-even/-/is-even-1.0.0.tgz#76b5055fbad8d294a86b6a949015e1c97b717c06"
- integrity sha1-drUFX7rY0pSoa2qUkBXhyXtxfAY=
- dependencies:
- is-odd "^0.1.2"
-
is-extendable@^0.1.0, is-extendable@^0.1.1:
version "0.1.1"
resolved "https://registry.yarnpkg.com/is-extendable/-/is-extendable-0.1.1.tgz#62b110e289a471418e3ec36a617d472e301dfc89"
@@ -4103,13 +3596,6 @@ is-installed-globally@^0.3.2:
global-dirs "^2.0.1"
is-path-inside "^3.0.1"
-is-number@^2.0.2:
- version "2.1.0"
- resolved "https://registry.yarnpkg.com/is-number/-/is-number-2.1.0.tgz#01fcbbb393463a548f2f466cce16dece49db908f"
- integrity sha1-Afy7s5NGOlSPL0ZszhbezknbkI8=
- dependencies:
- kind-of "^3.0.2"
-
is-number@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/is-number/-/is-number-3.0.0.tgz#24fd6201a4782cf50561c810276afc7d12d71195"
@@ -4129,13 +3615,6 @@ is-observable@^1.1.0:
dependencies:
symbol-observable "^1.1.0"
-is-odd@^0.1.2:
- version "0.1.2"
- resolved "https://registry.yarnpkg.com/is-odd/-/is-odd-0.1.2.tgz#bc573b5ce371ef2aad6e6f49799b72bef13978a7"
- integrity sha1-vFc7XONx7yqtbm9JeZtyvvE5eKc=
- dependencies:
- is-number "^3.0.0"
-
is-path-inside@^3.0.1:
version "3.0.3"
resolved "https://registry.yarnpkg.com/is-path-inside/-/is-path-inside-3.0.3.tgz#d231362e53a07ff2b0e0ea7fed049161ffd16283"
@@ -4163,13 +3642,6 @@ is-promise@^2.1.0:
resolved "https://registry.yarnpkg.com/is-promise/-/is-promise-2.2.2.tgz#39ab959ccbf9a774cf079f7b40c7a26f763135f1"
integrity sha512-+lP4/6lKUBfQjZ2pdxThZvLUAafmZb8OAxFb8XXtiQmS35INgr85hdOGoEs124ez1FCnZJt6jau/T+alh58QFQ==
-is-self-closing@^1.0.1:
- version "1.0.1"
- resolved "https://registry.yarnpkg.com/is-self-closing/-/is-self-closing-1.0.1.tgz#5f406b527c7b12610176320338af0fa3896416e4"
- integrity sha512-E+60FomW7Blv5GXTlYee2KDrnG6srxF7Xt1SjrhWUGUEsTFIqY/nq2y3DaftCsgUMdh89V07IVfhY9KIJhLezg==
- dependencies:
- self-closing-tags "^1.0.1"
-
is-stream@^1.1.0:
version "1.1.0"
resolved "https://registry.yarnpkg.com/is-stream/-/is-stream-1.1.0.tgz#12d4a3dd4e68e0b79ceb8dbc84173ae80d91ca44"
@@ -4212,11 +3684,6 @@ isexe@^2.0.0:
resolved "https://registry.yarnpkg.com/isexe/-/isexe-2.0.0.tgz#e8fbf374dc556ff8947a10dcb0572d633f2cfa10"
integrity sha1-6PvzdNxVb/iUehDcsFctYz8s+hA=
-isobject@^0.2.0:
- version "0.2.0"
- resolved "https://registry.yarnpkg.com/isobject/-/isobject-0.2.0.tgz#a3432192f39b910b5f02cc989487836ec70aa85e"
- integrity sha1-o0MhkvObkQtfAsyYlIeDbscKqF4=
-
isobject@^2.0.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/isobject/-/isobject-2.1.0.tgz#f065561096a3f1da2ef46272f815c840d87e0c89"
@@ -4799,7 +4266,7 @@ jsprim@^1.2.2:
json-schema "0.2.3"
verror "1.10.0"
-kind-of@^3.0.2, kind-of@^3.0.3, kind-of@^3.1.0, kind-of@^3.2.0:
+kind-of@^3.0.2, kind-of@^3.0.3, kind-of@^3.2.0:
version "3.2.2"
resolved "https://registry.yarnpkg.com/kind-of/-/kind-of-3.2.2.tgz#31ea21a734bab9bbb0f32466d893aea51e4a3c64"
integrity sha1-MeohpzS6ubuw8yRm2JOupR5KPGQ=
@@ -4813,12 +4280,12 @@ kind-of@^4.0.0:
dependencies:
is-buffer "^1.1.5"
-kind-of@^5.0.0, kind-of@^5.0.2:
+kind-of@^5.0.0:
version "5.1.0"
resolved "https://registry.yarnpkg.com/kind-of/-/kind-of-5.1.0.tgz#729c91e2d857b7a419a1f9aa65685c4c33f5845d"
integrity sha512-NGEErnH6F2vUuXDh+OlbcKW7/wOcfdRHaZ7VWtqCztfHri/++YKmP51OdWeGPuqCOba6kk2OTe5d02VmTB80Pw==
-kind-of@^6.0.0, kind-of@^6.0.2, kind-of@^6.0.3:
+kind-of@^6.0.0, kind-of@^6.0.2:
version "6.0.3"
resolved "https://registry.yarnpkg.com/kind-of/-/kind-of-6.0.3.tgz#07c05034a6c349fa06e24fa35aa76db4580ce4dd"
integrity sha512-dcS1ul+9tmeD95T+x28/ehLgd9mENa3LsvDTtzm3vyBEO7RPptvAD+t44WVXaUjTBRcrpFeFlC8WCruUR456hw==
@@ -4912,11 +4379,6 @@ lodash-es@^4.17.11:
resolved "https://registry.yarnpkg.com/lodash-es/-/lodash-es-4.17.21.tgz#43e626c46e6591b7750beb2b50117390c609e3ee"
integrity sha512-mKnC+QJ9pWVzv+C4/U3rRsHapFfHvQFoFB92e52xeyGMcX6/OlIl78je1u8vePzYZSkkogMPJ2yjxxsb89cxyw==
-lodash._reinterpolate@^3.0.0:
- version "3.0.0"
- resolved "https://registry.yarnpkg.com/lodash._reinterpolate/-/lodash._reinterpolate-3.0.0.tgz#0ccf2d89166af03b3663c796538b75ac6e114d9d"
- integrity sha1-DM8tiRZq8Ds2Y8eWU4t1rG4RTZ0=
-
lodash.debounce@^4.0.8:
version "4.0.8"
resolved "https://registry.yarnpkg.com/lodash.debounce/-/lodash.debounce-4.0.8.tgz#82d79bff30a67c4005ffd5e2515300ad9ca4d7af"
@@ -4927,22 +4389,7 @@ lodash.once@^4.1.1:
resolved "https://registry.yarnpkg.com/lodash.once/-/lodash.once-4.1.1.tgz#0dd3971213c7c56df880977d504c88fb471a97ac"
integrity sha1-DdOXEhPHxW34gJd9UEyI+0cal6w=
-lodash.template@^4.4.0:
- version "4.5.0"
- resolved "https://registry.yarnpkg.com/lodash.template/-/lodash.template-4.5.0.tgz#f976195cf3f347d0d5f52483569fe8031ccce8ab"
- integrity sha512-84vYFxIkmidUiFxidA/KjjH9pAycqW+h980j7Fuz5qxRtO9pgB7MDFTdys1N7A5mcucRiDyEq4fusljItR1T/A==
- dependencies:
- lodash._reinterpolate "^3.0.0"
- lodash.templatesettings "^4.0.0"
-
-lodash.templatesettings@^4.0.0:
- version "4.2.0"
- resolved "https://registry.yarnpkg.com/lodash.templatesettings/-/lodash.templatesettings-4.2.0.tgz#e481310f049d3cf6d47e912ad09313b154f0fb33"
- integrity sha512-stgLz+i3Aa9mZgnjr/O+v9ruKZsPsndy7qPZOchbqk2cnTU1ZaldKK+v7m54WoKIyxiuMZTKT2H81F8BeAc3ZQ==
- dependencies:
- lodash._reinterpolate "^3.0.0"
-
-lodash@4.17.21, lodash@^4.17.15, lodash@^4.17.19, lodash@^4.17.20, lodash@^4.17.21, lodash@^4.7.0:
+lodash@4.17.21, lodash@^4.17.15, lodash@^4.17.19, lodash@^4.17.21, lodash@^4.7.0:
version "4.17.21"
resolved "https://registry.yarnpkg.com/lodash/-/lodash-4.17.21.tgz#679591c564c3bffaae8454cf0b3df370c3d6911c"
integrity sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==
@@ -5004,6 +4451,11 @@ make-dir@^3.0.0:
dependencies:
semver "^6.0.0"
+make-error@^1.1.1:
+ version "1.3.6"
+ resolved "https://registry.yarnpkg.com/make-error/-/make-error-1.3.6.tgz#2eb2e37ea9b67c4891f684a1394799af484cf7a2"
+ integrity sha512-s8UhlNe7vPKomQhC1qFelMokr/Sc3AgNbso3n74mVPA5LTZwkB9NlXf4XPamLxJE8h0gh73rM94xvwRT2CVInw==
+
makeerror@1.0.12:
version "1.0.12"
resolved "https://registry.yarnpkg.com/makeerror/-/makeerror-1.0.12.tgz#3e5dd2079a82e812e983cc6610c4a2cb0eaa801a"
@@ -5043,7 +4495,7 @@ methods@^1.1.2:
resolved "https://registry.yarnpkg.com/methods/-/methods-1.1.2.tgz#5529a4d67654134edcc5266656835b0f851afcee"
integrity sha1-VSmk1nZUE07cxSZmVoNbD4Ua/O4=
-micromatch@^3.1.4, micromatch@^3.1.5:
+micromatch@^3.1.4:
version "3.1.10"
resolved "https://registry.yarnpkg.com/micromatch/-/micromatch-3.1.10.tgz#70859bc95c9840952f359a068a3fc49f9ecfac23"
integrity sha512-MWikgl9n9M3w+bpsY3He8L+w9eF9338xRl8IAO5viDizwSzziFEyUzo2xrrloB64ADbTf8uA8vRqqttDTOmccg==
@@ -5124,7 +4576,7 @@ mkdirp@^0.5.4:
dependencies:
minimist "^1.2.5"
-moment@^2.18.1, moment@^2.27.0:
+moment@^2.27.0:
version "2.29.1"
resolved "https://registry.yarnpkg.com/moment/-/moment-2.29.1.tgz#b2be769fa31940be9eeea6469c075e35006fa3d3"
integrity sha512-kHmoybcPV8Sqy59DwNDY3Jefr64lK/by/da0ViFcuA4DH0vQg5Q6Ze5VimxkfQNSC+Mls/Kx53s7TjP1RhFEDQ==
@@ -5181,11 +4633,6 @@ ncp@^2.0.0:
resolved "https://registry.yarnpkg.com/ncp/-/ncp-2.0.0.tgz#195a21d6c46e361d2fb1281ba38b91e9df7bdbb3"
integrity sha1-GVoh1sRuNh0vsSgbo4uR6d9727M=
-neo-async@^2.6.0:
- version "2.6.2"
- resolved "https://registry.yarnpkg.com/neo-async/-/neo-async-2.6.2.tgz#b4aafb93e3aeb2d8174ca53cf163ab7d7308305f"
- integrity sha512-Yd3UES5mWCSqR+qNT93S3UoYUkqAZ9lLg8a7g9rimsWmYGK8cVToA4/sF3RrshdyV3sAGMXVUmpMYOw+dLpOuw==
-
nice-try@^1.0.4:
version "1.0.5"
resolved "https://registry.yarnpkg.com/nice-try/-/nice-try-1.0.5.tgz#a3378a7696ce7d223e88fc9b764bd7ef1089e366"
@@ -5633,7 +5080,7 @@ read-pkg@^5.2.0:
parse-json "^5.0.0"
type-fest "^0.6.0"
-readable-stream@^2.2.2, readable-stream@~2.3.6:
+readable-stream@^2.2.2:
version "2.3.7"
resolved "https://registry.yarnpkg.com/readable-stream/-/readable-stream-2.3.7.tgz#1eca1cf711aef814c04f62252a36a62f6cb23b57"
integrity sha512-Ebho8K4jIbHAxnuxi7o42OrZgF/ZTNcsZj6nRKyUmkhLFq8CHItp/fy6hQZuZmP/n3yZ9VBUbp4zz/mX8hmYPw==
@@ -5686,16 +5133,6 @@ regex-not@^1.0.0, regex-not@^1.0.2:
extend-shallow "^3.0.2"
safe-regex "^1.1.0"
-regexparam@2.0.0:
- version "2.0.0"
- resolved "https://registry.yarnpkg.com/regexparam/-/regexparam-2.0.0.tgz#059476767d5f5f87f735fc7922d133fd1a118c8c"
- integrity sha512-gJKwd2MVPWHAIFLsaYDZfyKzHNS4o7E/v8YmNf44vmeV2e4YfVoDToTOKTvE7ab68cRJ++kLuEXJBaEeJVt5ow==
-
-regexparam@^1.3.0:
- version "1.3.0"
- resolved "https://registry.yarnpkg.com/regexparam/-/regexparam-1.3.0.tgz#2fe42c93e32a40eff6235d635e0ffa344b92965f"
- integrity sha512-6IQpFBv6e5vz1QAqI+V4k8P2e/3gRrqfCJ9FI+O1FLQTO+Uz6RXZEZOPmTJ6hlGj7gkERzY5BRCv09whKP96/g==
-
regexpu-core@^4.7.1:
version "4.8.0"
resolved "https://registry.yarnpkg.com/regexpu-core/-/regexpu-core-4.8.0.tgz#e5605ba361b67b1718478501327502f4479a98f0"
@@ -5720,21 +5157,6 @@ regjsparser@^0.7.0:
dependencies:
jsesc "~0.5.0"
-relative@^3.0.2:
- version "3.0.2"
- resolved "https://registry.yarnpkg.com/relative/-/relative-3.0.2.tgz#0dcd8ec54a5d35a3c15e104503d65375b5a5367f"
- integrity sha1-Dc2OxUpdNaPBXhBFA9ZTdbWlNn8=
- dependencies:
- isobject "^2.0.0"
-
-remarkable@^1.6.2, remarkable@^1.7.1:
- version "1.7.4"
- resolved "https://registry.yarnpkg.com/remarkable/-/remarkable-1.7.4.tgz#19073cb960398c87a7d6546eaa5e50d2022fcd00"
- integrity sha512-e6NKUXgX95whv7IgddywbeN/ItCkWbISmc2DiqHJb0wTrqZIexqdco5b8Z3XZoo/48IdNVKM9ZCvTPJ4F5uvhg==
- dependencies:
- argparse "^1.0.10"
- autolinker "~0.28.0"
-
remixicon@2.5.0:
version "2.5.0"
resolved "https://registry.yarnpkg.com/remixicon/-/remixicon-2.5.0.tgz#b5e245894a1550aa23793f95daceadbf96ad1a41"
@@ -5930,11 +5352,6 @@ saxes@^5.0.1:
dependencies:
xmlchars "^2.2.0"
-self-closing-tags@^1.0.1:
- version "1.0.1"
- resolved "https://registry.yarnpkg.com/self-closing-tags/-/self-closing-tags-1.0.1.tgz#6c5fa497994bb826b484216916371accee490a5d"
- integrity sha512-7t6hNbYMxM+VHXTgJmxwgZgLGktuXtVVD5AivWzNTdJBM4DBjnDKDzkf2SrNjihaArpeJYNjxkELBu1evI4lQA==
-
"semver@2 || 3 || 4 || 5", semver@^5.5.0:
version "5.7.1"
resolved "https://registry.yarnpkg.com/semver/-/semver-5.7.1.tgz#a954f931aeba508d307bbf069eff0c01c96116f7"
@@ -6008,13 +5425,6 @@ shortid@2.2.15:
dependencies:
nanoid "^2.1.0"
-shortid@^2.2.15:
- version "2.2.16"
- resolved "https://registry.yarnpkg.com/shortid/-/shortid-2.2.16.tgz#b742b8f0cb96406fd391c76bfc18a67a57fe5608"
- integrity sha512-Ugt+GIZqvGXCIItnsL+lvFJOiN7RYqlGy7QE41O3YC1xbNSeDGIRO7xg2JJXIAj1cAGnOeC1r7/T9pgrtQbv4g==
- dependencies:
- nanoid "^2.1.0"
-
signal-exit@^3.0.0, signal-exit@^3.0.2, signal-exit@^3.0.3:
version "3.0.5"
resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-3.0.5.tgz#9e3e8cc0c75a99472b44321033a7702e7738252f"
@@ -6301,11 +5711,6 @@ strip-indent@^3.0.0:
dependencies:
min-indent "^1.0.0"
-striptags@^3.1.1:
- version "3.2.0"
- resolved "https://registry.yarnpkg.com/striptags/-/striptags-3.2.0.tgz#cc74a137db2de8b0b9a370006334161f7dd67052"
- integrity sha512-g45ZOGzHDMe2bdYMdIvdAfCQkCTDMGBazSw1ypMowwGIee7ZQ5dU0rBJ8Jqgl+jAKIv4dbeE1jscZq9wid1Tkw==
-
supports-color@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/supports-color/-/supports-color-2.0.0.tgz#535d045ce6b6363fa40117084629995e9df324c7"
@@ -6333,25 +5738,11 @@ supports-hyperlinks@^2.0.0:
has-flag "^4.0.0"
supports-color "^7.0.0"
-svelte-apexcharts@^1.0.2:
- version "1.0.2"
- resolved "https://registry.yarnpkg.com/svelte-apexcharts/-/svelte-apexcharts-1.0.2.tgz#4e000f8b8f7c901c05658c845457dfc8314d54c1"
- integrity sha512-6qlx4rE+XsonZ0FZudfwqOQ34Pq+3wpxgAD75zgEmGoYhYBJcwmikTuTf3o8ZBsZue9U/pAwhNy3ed1Bkq1gmA==
- dependencies:
- apexcharts "^3.19.2"
-
svelte-dnd-action@^0.9.8:
version "0.9.12"
resolved "https://registry.yarnpkg.com/svelte-dnd-action/-/svelte-dnd-action-0.9.12.tgz#78cf33097986488c6d069eca517af473cd998730"
integrity sha512-GlXIB3/56IMR5A0+qUx+FX7Q7n8uCAIeuYdgSBmn9iOlxWc+mgM8P1kNwAKCMSTdQ4IQETVQILNgWVY1KIFzsg==
-svelte-flatpickr@^3.1.0, svelte-flatpickr@^3.2.3:
- version "3.2.6"
- resolved "https://registry.yarnpkg.com/svelte-flatpickr/-/svelte-flatpickr-3.2.6.tgz#595a97b2f25a669e61fe743f90a10dce783bbd49"
- integrity sha512-0ePUyE9OjInYFqQwRKOxnFSu4dQX9+/rzFMynq2fKYXx406ZUThzSx72gebtjr0DoAQbsH2///BBZa5qk4qZXg==
- dependencies:
- flatpickr "^4.5.2"
-
svelte-hmr@^0.14.7:
version "0.14.7"
resolved "https://registry.yarnpkg.com/svelte-hmr/-/svelte-hmr-0.14.7.tgz#7fa8261c7b225d9409f0a86f3b9ea5c3ca6f6607"
@@ -6372,78 +5763,11 @@ svelte-portal@0.1.0:
resolved "https://registry.yarnpkg.com/svelte-portal/-/svelte-portal-0.1.0.tgz#cc2821cc84b05ed5814e0218dcdfcbebc53c1742"
integrity sha512-kef+ksXVKun224mRxat+DdO4C+cGHla+fEcZfnBAvoZocwiaceOfhf5azHYOPXSSB1igWVFTEOF3CDENPnuWxg==
-svelte-portal@^1.0.0:
- version "1.0.0"
- resolved "https://registry.yarnpkg.com/svelte-portal/-/svelte-portal-1.0.0.tgz#36a47c5578b1a4d9b4dc60fa32a904640ec4cdd3"
- integrity sha512-nHf+DS/jZ6jjnZSleBMSaZua9JlG5rZv9lOGKgJuaZStfevtjIlUJrkLc3vbV8QdBvPPVmvcjTlazAzfKu0v3Q==
-
-svelte-spa-router@^3.0.5:
- version "3.2.0"
- resolved "https://registry.yarnpkg.com/svelte-spa-router/-/svelte-spa-router-3.2.0.tgz#fae3311d292451236cb57131262406cf312b15ee"
- integrity sha512-igemo5Vs82TGBBw+DjWt6qKameXYzNs6aDXcTxou5XbEvOjiRcAM6MLkdVRCatn6u8r42dE99bt/br7T4qe/AQ==
- dependencies:
- regexparam "2.0.0"
-
svelte@^3.38.2:
version "3.44.1"
resolved "https://registry.yarnpkg.com/svelte/-/svelte-3.44.1.tgz#5cc772a8340f4519a4ecd1ac1a842325466b1a63"
integrity sha512-4DrCEJoBvdR689efHNSxIQn2pnFwB7E7j2yLEJtHE/P8hxwZWIphCtJ8are7bjl/iVMlcEf5uh5pJ68IwR09vQ==
-svg.draggable.js@^2.2.2:
- version "2.2.2"
- resolved "https://registry.yarnpkg.com/svg.draggable.js/-/svg.draggable.js-2.2.2.tgz#c514a2f1405efb6f0263e7958f5b68fce50603ba"
- integrity sha512-JzNHBc2fLQMzYCZ90KZHN2ohXL0BQJGQimK1kGk6AvSeibuKcIdDX9Kr0dT9+UJ5O8nYA0RB839Lhvk4CY4MZw==
- dependencies:
- svg.js "^2.0.1"
-
-svg.easing.js@^2.0.0:
- version "2.0.0"
- resolved "https://registry.yarnpkg.com/svg.easing.js/-/svg.easing.js-2.0.0.tgz#8aa9946b0a8e27857a5c40a10eba4091e5691f12"
- integrity sha1-iqmUawqOJ4V6XEChDrpAkeVpHxI=
- dependencies:
- svg.js ">=2.3.x"
-
-svg.filter.js@^2.0.2:
- version "2.0.2"
- resolved "https://registry.yarnpkg.com/svg.filter.js/-/svg.filter.js-2.0.2.tgz#91008e151389dd9230779fcbe6e2c9a362d1c203"
- integrity sha1-kQCOFROJ3ZIwd5/L5uLJo2LRwgM=
- dependencies:
- svg.js "^2.2.5"
-
-svg.js@>=2.3.x, svg.js@^2.0.1, svg.js@^2.2.5, svg.js@^2.4.0, svg.js@^2.6.5:
- version "2.7.1"
- resolved "https://registry.yarnpkg.com/svg.js/-/svg.js-2.7.1.tgz#eb977ed4737001eab859949b4a398ee1bb79948d"
- integrity sha512-ycbxpizEQktk3FYvn/8BH+6/EuWXg7ZpQREJvgacqn46gIddG24tNNe4Son6omdXCnSOaApnpZw6MPCBA1dODA==
-
-svg.pathmorphing.js@^0.1.3:
- version "0.1.3"
- resolved "https://registry.yarnpkg.com/svg.pathmorphing.js/-/svg.pathmorphing.js-0.1.3.tgz#c25718a1cc7c36e852ecabc380e758ac09bb2b65"
- integrity sha512-49HWI9X4XQR/JG1qXkSDV8xViuTLIWm/B/7YuQELV5KMOPtXjiwH4XPJvr/ghEDibmLQ9Oc22dpWpG0vUDDNww==
- dependencies:
- svg.js "^2.4.0"
-
-svg.resize.js@^1.4.3:
- version "1.4.3"
- resolved "https://registry.yarnpkg.com/svg.resize.js/-/svg.resize.js-1.4.3.tgz#885abd248e0cd205b36b973c4b578b9a36f23332"
- integrity sha512-9k5sXJuPKp+mVzXNvxz7U0uC9oVMQrrf7cFsETznzUDDm0x8+77dtZkWdMfRlmbkEEYvUn9btKuZ3n41oNA+uw==
- dependencies:
- svg.js "^2.6.5"
- svg.select.js "^2.1.2"
-
-svg.select.js@^2.1.2:
- version "2.1.2"
- resolved "https://registry.yarnpkg.com/svg.select.js/-/svg.select.js-2.1.2.tgz#e41ce13b1acff43a7441f9f8be87a2319c87be73"
- integrity sha512-tH6ABEyJsAOVAhwcCjF8mw4crjXSI1aa7j2VQR8ZuJ37H2MBUbyeqYr5nEO7sSN3cy9AR9DUwNg0t/962HlDbQ==
- dependencies:
- svg.js "^2.2.5"
-
-svg.select.js@^3.0.1:
- version "3.0.1"
- resolved "https://registry.yarnpkg.com/svg.select.js/-/svg.select.js-3.0.1.tgz#a4198e359f3825739226415f82176a90ea5cc917"
- integrity sha512-h5IS/hKkuVCbKSieR9uQCj9w+zLHoPh+ce19bBYyqF53g6mnPB8sAtIbe1s9dh2S2fCmYX2xel1Ln3PJBbK4kw==
- dependencies:
- svg.js "^2.6.5"
-
symbol-observable@^1.1.0:
version "1.2.0"
resolved "https://registry.yarnpkg.com/symbol-observable/-/symbol-observable-1.2.0.tgz#c22688aed4eab3cdc2dfeacbb561660560a00804"
@@ -6486,14 +5810,6 @@ throttleit@^1.0.0:
resolved "https://registry.yarnpkg.com/throttleit/-/throttleit-1.0.0.tgz#9e785836daf46743145a5984b6268d828528ac6c"
integrity sha1-nnhYNtr0Z0MUWlmEtiaNgoUorGw=
-through2@^2.0.0:
- version "2.0.5"
- resolved "https://registry.yarnpkg.com/through2/-/through2-2.0.5.tgz#01c1e39eb31d07cb7d03a96a70823260b23132cd"
- integrity sha512-/mrRod8xqpA+IHSLyGCQ2s8SPHiCDEeQJSep1jqLYeEUClOFG2Qsh+4FU6G9VeqpZnGW/Su8LQGc4YKni5rYSQ==
- dependencies:
- readable-stream "~2.3.6"
- xtend "~4.0.1"
-
through@2, through@~2.3, through@~2.3.1:
version "2.3.8"
resolved "https://registry.yarnpkg.com/through/-/through-2.3.8.tgz#0dd4c9ffaabc357960b1b724115d7e0e86a2e1f5"
@@ -6516,11 +5832,6 @@ to-fast-properties@^2.0.0:
resolved "https://registry.yarnpkg.com/to-fast-properties/-/to-fast-properties-2.0.0.tgz#dc5e698cbd079265bc73e0377681a4e4e83f616e"
integrity sha1-3F5pjL0HkmW8c+A3doGk5Og/YW4=
-to-gfm-code-block@^0.1.1:
- version "0.1.1"
- resolved "https://registry.yarnpkg.com/to-gfm-code-block/-/to-gfm-code-block-0.1.1.tgz#25d045a5fae553189e9637b590900da732d8aa82"
- integrity sha1-JdBFpfrlUxielje1kJANpzLYqoI=
-
to-object-path@^0.3.0:
version "0.3.0"
resolved "https://registry.yarnpkg.com/to-object-path/-/to-object-path-0.3.0.tgz#297588b7b0e7e0ac08e04e672f85c1f4999e17af"
@@ -6587,6 +5898,24 @@ tr46@~0.0.3:
resolved "https://registry.yarnpkg.com/tr46/-/tr46-0.0.3.tgz#8184fd347dac9cdc185992f3a6622e14b9d9ab6a"
integrity sha1-gYT9NH2snNwYWZLzpmIuFLnZq2o=
+ts-node@^10.4.0:
+ version "10.4.0"
+ resolved "https://registry.yarnpkg.com/ts-node/-/ts-node-10.4.0.tgz#680f88945885f4e6cf450e7f0d6223dd404895f7"
+ integrity sha512-g0FlPvvCXSIO1JDF6S232P5jPYqBkRL9qly81ZgAOSU7rwI0stphCgd2kLiCrU9DjQCrJMWEqcNSjQL02s6d8A==
+ dependencies:
+ "@cspotcode/source-map-support" "0.7.0"
+ "@tsconfig/node10" "^1.0.7"
+ "@tsconfig/node12" "^1.0.7"
+ "@tsconfig/node14" "^1.0.0"
+ "@tsconfig/node16" "^1.0.2"
+ acorn "^8.4.1"
+ acorn-walk "^8.1.1"
+ arg "^4.1.0"
+ create-require "^1.1.0"
+ diff "^4.0.1"
+ make-error "^1.1.1"
+ yn "3.1.1"
+
tslib@^1.9.0, tslib@^1.9.3:
version "1.14.1"
resolved "https://registry.yarnpkg.com/tslib/-/tslib-1.14.1.tgz#cf2d38bdc34a134bcaf1091c41f6619e2f672d00"
@@ -6648,17 +5977,10 @@ typedarray@^0.0.6:
resolved "https://registry.yarnpkg.com/typedarray/-/typedarray-0.0.6.tgz#867ac74e3864187b1d3d47d996a78ec5c8830777"
integrity sha1-hnrHTjhkGHsdPUfZlqeOxciDB3c=
-typeof-article@^0.1.1:
- version "0.1.1"
- resolved "https://registry.yarnpkg.com/typeof-article/-/typeof-article-0.1.1.tgz#9f07e733c3fbb646ffa9e61c08debacd460e06af"
- integrity sha1-nwfnM8P7tkb/qeYcCN66zUYOBq8=
- dependencies:
- kind-of "^3.1.0"
-
-uglify-js@^3.1.4:
- version "3.14.5"
- resolved "https://registry.yarnpkg.com/uglify-js/-/uglify-js-3.14.5.tgz#cdabb7d4954231d80cb4a927654c4655e51f4859"
- integrity sha512-qZukoSxOG0urUTvjc2ERMTcAy+BiFh3weWAkeurLwjrCba73poHmG3E36XEjd/JGukMzwTL7uCxZiAexj8ppvQ==
+typescript@^4.5.5:
+ version "4.5.5"
+ resolved "https://registry.yarnpkg.com/typescript/-/typescript-4.5.5.tgz#d8c953832d28924a9e3d37c73d729c846c5896f3"
+ integrity sha512-TCTIul70LyWe6IJWT8QSYeA54WQe8EjQFU4wY52Fasj5UKx88LNYKCgBEHcOMOrFF1rKGbD8v/xcNWVUq9SymA==
unicode-canonical-property-names-ecmascript@^2.0.0:
version "2.0.0"
@@ -6801,11 +6123,6 @@ vite@^2.1.5:
optionalDependencies:
fsevents "~2.3.2"
-vm2@^3.9.4:
- version "3.9.5"
- resolved "https://registry.yarnpkg.com/vm2/-/vm2-3.9.5.tgz#5288044860b4bbace443101fcd3bddb2a0aa2496"
- integrity sha512-LuCAHZN75H9tdrAiLFf030oW7nJV5xwNMuk1ymOZwopmuK3d2H4L1Kv4+GFHgarKiLfXXLFU+7LDABHnwOkWng==
-
w3c-hr-time@^1.0.2:
version "1.0.2"
resolved "https://registry.yarnpkg.com/w3c-hr-time/-/w3c-hr-time-1.0.2.tgz#0a89cdf5cc15822df9c360543676963e0cc308cd"
@@ -6906,11 +6223,6 @@ word-wrap@~1.2.3:
resolved "https://registry.yarnpkg.com/word-wrap/-/word-wrap-1.2.3.tgz#610636f6b1f703891bd34771ccb17fb93b47079c"
integrity sha512-Hz/mrNwitNRh/HUAtM/VT/5VH+ygD6DV7mYKZAtHOrbs8U7lvPS6xf7EJKMF0uW1KJCl0H701g3ZGus+muE5vQ==
-wordwrap@^1.0.0:
- version "1.0.0"
- resolved "https://registry.yarnpkg.com/wordwrap/-/wordwrap-1.0.0.tgz#27584810891456a4171c8d0226441ade90cbcaeb"
- integrity sha1-J1hIEIkUVqQXHI0CJkQa3pDLyus=
-
wrap-ansi@^3.0.1:
version "3.0.1"
resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-3.0.1.tgz#288a04d87eda5c286e060dfe8f135ce8d007f8ba"
@@ -6958,11 +6270,6 @@ xmlchars@^2.2.0:
resolved "https://registry.yarnpkg.com/xmlchars/-/xmlchars-2.2.0.tgz#060fe1bcb7f9c76fe2a17db86a9bc3ab894210cb"
integrity sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw==
-xtend@~4.0.1:
- version "4.0.2"
- resolved "https://registry.yarnpkg.com/xtend/-/xtend-4.0.2.tgz#bb72779f5fa465186b1f438f674fa347fdb5db54"
- integrity sha512-LKYU1iAXJXUgAXn9URjiu+MWhyUXHsvfp7mcuYm9dSUKK0/CjtrUwFAxD82/mCWbtLsGjFIad0wIsod4zrTAEQ==
-
y18n@^4.0.0:
version "4.0.3"
resolved "https://registry.yarnpkg.com/y18n/-/y18n-4.0.3.tgz#b5f259c82cd6e336921efd7bfd8bf560de9eeedf"
@@ -7006,10 +6313,10 @@ yauzl@^2.10.0:
buffer-crc32 "~0.2.3"
fd-slicer "~1.1.0"
-year@^0.2.1:
- version "0.2.1"
- resolved "https://registry.yarnpkg.com/year/-/year-0.2.1.tgz#4083ae520a318b23ec86037f3000cb892bdf9bb0"
- integrity sha1-QIOuUgoxiyPshgN/MADLiSvfm7A=
+yn@3.1.1:
+ version "3.1.1"
+ resolved "https://registry.yarnpkg.com/yn/-/yn-3.1.1.tgz#1e87401a09d767c1d5eab26a6e4c185182d2eb50"
+ integrity sha512-Ux4ygGWsu2c7isFWe8Yu1YluJmqVhxqK2cLXNQA5AcC3QfbGNpM7fu0Y8b/z16pXLnFxZYvWhd3fhBY9DLmC6Q==
yup@0.29.2:
version "0.29.2"
diff --git a/packages/cli/package.json b/packages/cli/package.json
index b3cad507c2..f2c3de30d6 100644
--- a/packages/cli/package.json
+++ b/packages/cli/package.json
@@ -1,6 +1,6 @@
{
"name": "@budibase/cli",
- "version": "1.0.49-alpha.5",
+ "version": "1.0.50-alpha.6",
"description": "Budibase CLI, for developers, self hosting and migrations.",
"main": "src/index.js",
"bin": {
diff --git a/packages/cli/yarn.lock b/packages/cli/yarn.lock
index b1f55618a0..7019ee169f 100644
--- a/packages/cli/yarn.lock
+++ b/packages/cli/yarn.lock
@@ -1583,9 +1583,9 @@ simple-concat@^1.0.0:
integrity sha512-cSFtAPtRhljv69IK0hTVZQ+OfE9nePi/rtJmw5UjHeVyVroEqJXP1sFztKUy1qU+xvz3u/sfYJLa947b7nAN2Q==
simple-get@^3.0.3:
- version "3.1.0"
- resolved "https://registry.yarnpkg.com/simple-get/-/simple-get-3.1.0.tgz#b45be062435e50d159540b576202ceec40b9c6b3"
- integrity sha512-bCR6cP+aTdScaQCnQKbPKtJOKDp/hj9EDLJo3Nw4y1QksqaovlW/bnptB6/c1e+qmNIDHRK+oXFDdEqBT8WzUA==
+ version "3.1.1"
+ resolved "https://registry.yarnpkg.com/simple-get/-/simple-get-3.1.1.tgz#cc7ba77cfbe761036fbfce3d021af25fc5584d55"
+ integrity sha512-CQ5LTKGfCpvE1K0n2us+kuMPbk/q0EKl82s4aheV9oXjFEz6W/Y7oQFVJuU6QG77hRT4Ghb5RURteF5vnWjupA==
dependencies:
decompress-response "^4.2.0"
once "^1.3.1"
diff --git a/packages/client/manifest.json b/packages/client/manifest.json
index 499e67c6a2..06dbaad660 100644
--- a/packages/client/manifest.json
+++ b/packages/client/manifest.json
@@ -1942,6 +1942,35 @@
"type": "validation/string",
"label": "Validation",
"key": "validation"
+ },
+ {
+ "type": "select",
+ "label": "Alignment",
+ "key": "align",
+ "defaultValue": "left",
+ "showInBar": true,
+ "barStyle": "buttons",
+ "options": [{
+ "label": "Left",
+ "value": "left",
+ "barIcon": "TextAlignLeft",
+ "barTitle": "Align left"
+ }, {
+ "label": "Center",
+ "value": "center",
+ "barIcon": "TextAlignCenter",
+ "barTitle": "Align center"
+ }, {
+ "label": "Right",
+ "value": "right",
+ "barIcon": "TextAlignRight",
+ "barTitle": "Align right"
+ }, {
+ "label": "Justify",
+ "value": "justify",
+ "barIcon": "TextAlignJustify",
+ "barTitle": "Justify text"
+ }]
}
]
},
@@ -2336,11 +2365,10 @@
]
},
"longformfield": {
- "name": "Rich Text",
+ "name": "Long Form Field",
"icon": "TextParagraph",
"styles": ["size"],
"editable": true,
- "illegalChildren": ["section"],
"settings": [
{
"type": "field/longform",
@@ -2363,6 +2391,27 @@
"label": "Default value",
"key": "defaultValue"
},
+ {
+ "type": "select",
+ "label": "Formatting",
+ "key": "format",
+ "placeholder": null,
+ "options": [
+ {
+ "label": "Auto",
+ "value": "auto"
+ },
+ {
+ "label": "Plain text",
+ "value": "plain"
+ },
+ {
+ "label": "Rich text (markdown)",
+ "value": "rich"
+ }
+ ],
+ "defaultValue": "auto"
+ },
{
"type": "boolean",
"label": "Disabled",
@@ -2404,6 +2453,12 @@
"key": "enableTime",
"defaultValue": true
},
+ {
+ "type": "boolean",
+ "label": "Time Only",
+ "key": "timeOnly",
+ "defaultValue": false
+ },
{
"type": "text",
"label": "Default value",
@@ -2479,6 +2534,11 @@
"label": "Placeholder",
"key": "placeholder"
},
+ {
+ "type": "text",
+ "label": "Default value",
+ "key": "defaultValue"
+ },
{
"type": "boolean",
"label": "Autocomplete",
@@ -3385,5 +3445,18 @@
"key": "validation"
}
]
+ },
+ "markdownviewer": {
+ "name": "Markdown Viewer",
+ "icon": "TaskList",
+ "styles": ["size"],
+ "editable": true,
+ "settings": [
+ {
+ "type": "text",
+ "label": "Markdown",
+ "key": "value"
+ }
+ ]
}
}
diff --git a/packages/client/package.json b/packages/client/package.json
index 9cec4b8050..ab5ce248ad 100644
--- a/packages/client/package.json
+++ b/packages/client/package.json
@@ -1,6 +1,6 @@
{
"name": "@budibase/client",
- "version": "1.0.49-alpha.5",
+ "version": "1.0.50-alpha.6",
"license": "MPL-2.0",
"module": "dist/budibase-client.js",
"main": "dist/budibase-client.js",
@@ -19,9 +19,9 @@
"dev:builder": "rollup -cw"
},
"dependencies": {
- "@budibase/bbui": "^1.0.49-alpha.5",
- "@budibase/standard-components": "^0.9.139",
- "@budibase/string-templates": "^1.0.49-alpha.5",
+ "@budibase/bbui": "^1.0.50-alpha.6",
+ "@budibase/frontend-core": "^1.0.50-alpha.6",
+ "@budibase/string-templates": "^1.0.50-alpha.6",
"regexparam": "^1.3.0",
"rollup-plugin-polyfill-node": "^0.8.0",
"shortid": "^2.2.15",
diff --git a/packages/client/rollup.config.js b/packages/client/rollup.config.js
index bde9d2325f..1aee91df42 100644
--- a/packages/client/rollup.config.js
+++ b/packages/client/rollup.config.js
@@ -57,10 +57,6 @@ export default {
find: "sdk",
replacement: path.resolve("./src/sdk"),
},
- {
- find: "builder",
- replacement: path.resolve("../builder"),
- },
],
}),
svelte({
diff --git a/packages/client/src/api/analytics.js b/packages/client/src/api/analytics.js
deleted file mode 100644
index 5a089eaa21..0000000000
--- a/packages/client/src/api/analytics.js
+++ /dev/null
@@ -1,10 +0,0 @@
-import API from "./api"
-
-/**
- * Notifies that an end user client app has been loaded.
- */
-export const pingEndUser = async () => {
- return await API.post({
- url: `/api/analytics/ping`,
- })
-}
diff --git a/packages/client/src/api/api.js b/packages/client/src/api/api.js
index 1bb12cca53..591d4a6782 100644
--- a/packages/client/src/api/api.js
+++ b/packages/client/src/api/api.js
@@ -1,110 +1,50 @@
-import { notificationStore, authStore } from "stores"
+import { createAPIClient } from "@budibase/frontend-core"
+import { notificationStore, authStore } from "../stores"
import { get } from "svelte/store"
-import { ApiVersion } from "constants"
-/**
- * API cache for cached request responses.
- */
-let cache = {}
+export const API = createAPIClient({
+ // Enable caching of cacheable endpoints to speed things up
+ enableCaching: true,
-/**
- * Handler for API errors.
- */
-const handleError = error => {
- return { error }
-}
+ // Attach client specific headers
+ attachHeaders: headers => {
+ // Attach app ID header
+ headers["x-budibase-app-id"] = window["##BUDIBASE_APP_ID##"]
-/**
- * Performs an API call to the server.
- * App ID header is always correctly set.
- */
-const makeApiCall = async ({ method, url, body, json = true }) => {
- try {
- const requestBody = json ? JSON.stringify(body) : body
- const inBuilder = window["##BUDIBASE_IN_BUILDER##"]
- const headers = {
- Accept: "application/json",
- "x-budibase-app-id": window["##BUDIBASE_APP_ID##"],
- "x-budibase-api-version": ApiVersion,
- ...(json && { "Content-Type": "application/json" }),
- ...(!inBuilder && { "x-budibase-type": "client" }),
+ // Attach client header if not inside the builder preview
+ if (!window["##BUDIBASE_IN_BUILDER##"]) {
+ headers["x-budibase-type"] = "client"
}
- // add csrf token if authenticated
+ // Add csrf token if authenticated
const auth = get(authStore)
- if (auth && auth.csrfToken) {
+ if (auth?.csrfToken) {
headers["x-csrf-token"] = auth.csrfToken
}
+ },
- const response = await fetch(url, {
- method,
- headers,
- body: requestBody,
- credentials: "same-origin",
- })
- switch (response.status) {
- case 200:
- try {
- return await response.json()
- } catch (error) {
- return null
- }
- case 401:
- notificationStore.actions.error("Invalid credentials")
- return handleError(`Invalid credentials`)
- case 404:
- notificationStore.actions.warning("Not found")
- return handleError(`${url}: Not Found`)
- case 400:
- return handleError(`${url}: Bad Request`)
- case 403:
- notificationStore.actions.error(
- "Your session has expired, or you don't have permission to access that data"
- )
- return handleError(`${url}: Forbidden`)
- default:
- if (response.status >= 200 && response.status < 400) {
- return response.json()
- }
- return handleError(`${url} - ${response.statusText}`)
+ // Show an error notification for all API failures.
+ // We could also log these to Sentry.
+ // Or we could check error.status and redirect to login on a 403 etc.
+ onError: error => {
+ const { status, method, url, message, handled } = error || {}
+
+ // Log any errors that we haven't manually handled
+ if (!handled) {
+ console.error("Unhandled error from API client", error)
+ return
}
- } catch (error) {
- return handleError(error)
- }
-}
-/**
- * Performs an API call to the server and caches the response.
- * Future invocation for this URL will return the cached result instead of
- * hitting the server again.
- */
-const makeCachedApiCall = async params => {
- const identifier = params.url
- if (!identifier) {
- return null
- }
- if (!cache[identifier]) {
- cache[identifier] = makeApiCall(params)
- cache[identifier] = await cache[identifier]
- }
- return await cache[identifier]
-}
+ // Notify all errors
+ if (message) {
+ // Don't notify if the URL contains the word analytics as it may be
+ // blocked by browser extensions
+ if (!url?.includes("analytics")) {
+ notificationStore.actions.error(message)
+ }
+ }
-/**
- * Constructs an API call function for a particular HTTP method.
- */
-const requestApiCall = method => async params => {
- const { external = false, url, cache = false } = params
- const fixedUrl = external ? url : `/${url}`.replace("//", "/")
- const enrichedParams = { ...params, method, url: fixedUrl }
- return await (cache ? makeCachedApiCall : makeApiCall)(enrichedParams)
-}
-
-export default {
- post: requestApiCall("POST"),
- put: requestApiCall("PUT"),
- get: requestApiCall("GET"),
- patch: requestApiCall("PATCH"),
- del: requestApiCall("DELETE"),
- error: handleError,
-}
+ // Log all errors to console
+ console.warn(`[Client] HTTP ${status} on ${method}:${url}\n\t${message}`)
+ },
+})
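The hunk above replaces the hand-rolled fetch wrapper with a configured client from `@budibase/frontend-core`, passing `attachHeaders` and `onError` hooks. A minimal sketch of that hook pattern, with `createAPIClient` as a hypothetical stand-in rather than the real frontend-core export:

```javascript
// Stand-in factory illustrating the hook shape; the real implementation
// lives in @budibase/frontend-core and performs actual fetch() calls.
const createAPIClient = ({ attachHeaders, onError }) => ({
  get: async url => {
    const headers = { Accept: "application/json" }
    try {
      // Let the caller attach app-specific headers before the request
      attachHeaders(headers)
      // A real client would fetch() here; this sketch just returns the
      // request description so the hook effect is visible
      return { url, headers }
    } catch (error) {
      onError(error)
      return null
    }
  },
})

const API = createAPIClient({
  attachHeaders: headers => {
    // Mirrors the client header logic from the diff above
    headers["x-budibase-type"] = "client"
  },
  onError: error => {
    console.warn("API error", error)
  },
})
```

This keeps transport concerns in one shared client while each consumer (builder, client app) injects only its own headers and error handling.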
diff --git a/packages/client/src/api/app.js b/packages/client/src/api/app.js
deleted file mode 100644
index c5ee305cda..0000000000
--- a/packages/client/src/api/app.js
+++ /dev/null
@@ -1,10 +0,0 @@
-import API from "./api"
-
-/**
- * Fetches screen definition for an app.
- */
-export const fetchAppPackage = async appId => {
- return await API.get({
- url: `/api/applications/${appId}/appPackage`,
- })
-}
diff --git a/packages/client/src/api/attachments.js b/packages/client/src/api/attachments.js
deleted file mode 100644
index ed9c6fe522..0000000000
--- a/packages/client/src/api/attachments.js
+++ /dev/null
@@ -1,50 +0,0 @@
-import API from "./api"
-
-/**
- * Uploads an attachment to the server.
- */
-export const uploadAttachment = async (data, tableId = "") => {
- return await API.post({
- url: `/api/attachments/${tableId}/upload`,
- body: data,
- json: false,
- })
-}
-
-/**
- * Generates a signed URL to upload a file to an external datasource.
- */
-export const getSignedDatasourceURL = async (datasourceId, bucket, key) => {
- if (!datasourceId) {
- return null
- }
- const res = await API.post({
- url: `/api/attachments/${datasourceId}/url`,
- body: { bucket, key },
- })
- if (res.error) {
- throw "Could not generate signed upload URL"
- }
- return res
-}
-
-/**
- * Uploads a file to an external datasource.
- */
-export const externalUpload = async (datasourceId, bucket, key, data) => {
- const { signedUrl, publicUrl } = await getSignedDatasourceURL(
- datasourceId,
- bucket,
- key
- )
- const res = await API.put({
- url: signedUrl,
- body: data,
- json: false,
- external: true,
- })
- if (res?.error) {
- throw "Could not upload file to signed URL"
- }
- return { publicUrl }
-}
diff --git a/packages/client/src/api/auth.js b/packages/client/src/api/auth.js
deleted file mode 100644
index 9ac09f5571..0000000000
--- a/packages/client/src/api/auth.js
+++ /dev/null
@@ -1,45 +0,0 @@
-import API from "./api"
-import { enrichRows } from "./rows"
-import { TableNames } from "../constants"
-
-/**
- * Performs a log in request.
- */
-export const logIn = async ({ email, password }) => {
- if (!email) {
- return API.error("Please enter your email")
- }
- if (!password) {
- return API.error("Please enter your password")
- }
- return await API.post({
- url: "/api/global/auth",
- body: { username: email, password },
- })
-}
-
-/**
- * Logs the user out and invaidates their session.
- */
-export const logOut = async () => {
- return await API.post({
- url: "/api/global/auth/logout",
- })
-}
-
-/**
- * Fetches the currently logged in user object
- */
-export const fetchSelf = async () => {
- const user = await API.get({ url: "/api/self" })
- if (user && user._id) {
- if (user.roleId === "PUBLIC") {
- // Don't try to enrich a public user as it will 403
- return user
- } else {
- return (await enrichRows([user], TableNames.USERS))[0]
- }
- } else {
- return null
- }
-}
diff --git a/packages/client/src/api/automations.js b/packages/client/src/api/automations.js
deleted file mode 100644
index cb3e4623ad..0000000000
--- a/packages/client/src/api/automations.js
+++ /dev/null
@@ -1,16 +0,0 @@
-import { notificationStore } from "stores/notification"
-import API from "./api"
-
-/**
- * Executes an automation. Must have "App Action" trigger.
- */
-export const triggerAutomation = async (automationId, fields) => {
- const res = await API.post({
- url: `/api/automations/${automationId}/trigger`,
- body: { fields },
- })
- res.error
- ? notificationStore.actions.error("An error has occurred")
- : notificationStore.actions.success("Automation triggered")
- return res
-}
diff --git a/packages/client/src/api/index.js b/packages/client/src/api/index.js
index d429eb437c..5eb6b2b6f4 100644
--- a/packages/client/src/api/index.js
+++ b/packages/client/src/api/index.js
@@ -1,11 +1,9 @@
-export * from "./rows"
-export * from "./auth"
-export * from "./tables"
-export * from "./attachments"
-export * from "./views"
-export * from "./relationships"
-export * from "./routes"
-export * from "./queries"
-export * from "./app"
-export * from "./automations"
-export * from "./analytics"
+import { API } from "./api.js"
+import { patchAPI } from "./patches.js"
+
+// Certain endpoints which return rows need to be patched so that they transform
+// and enrich the row docs, so that they can be correctly handled by the
+// client library
+patchAPI(API)
+
+export { API }
diff --git a/packages/client/src/api/patches.js b/packages/client/src/api/patches.js
new file mode 100644
index 0000000000..faad9c81ec
--- /dev/null
+++ b/packages/client/src/api/patches.js
@@ -0,0 +1,107 @@
+import { Constants } from "@budibase/frontend-core"
+import { FieldTypes } from "../constants"
+
+export const patchAPI = API => {
+ /**
+ * Enriches rows which contain certain field types so that they can
+ * be properly displayed.
+ * The ability to create these bindings has been removed, but they will still
+ * exist in client apps to support backwards compatibility.
+ */
+ const enrichRows = async (rows, tableId) => {
+ if (!Array.isArray(rows)) {
+ return []
+ }
+ if (rows.length) {
+ const tables = {}
+ for (let row of rows) {
+ // Fall back to passed in tableId if row doesn't have it specified
+ let rowTableId = row.tableId || tableId
+ let table = tables[rowTableId]
+ if (!table) {
+ // Fetch table schema so we can check column types
+ table = await API.fetchTableDefinition(rowTableId)
+ tables[rowTableId] = table
+ }
+ const schema = table?.schema
+ if (schema) {
+ const keys = Object.keys(schema)
+ for (let key of keys) {
+ const type = schema[key].type
+ if (type === FieldTypes.LINK && Array.isArray(row[key])) {
+ // Enrich row with a string join of relationship fields
+ row[`${key}_text`] =
+ row[key]
+ ?.map(option => option?.primaryDisplay)
+ .filter(option => !!option)
+ .join(", ") || ""
+ } else if (type === "attachment") {
+ // Enrich row with the first image URL for any attachment fields
+ let url = null
+ if (Array.isArray(row[key]) && row[key][0] != null) {
+ url = row[key][0].url
+ }
+ row[`${key}_first`] = url
+ }
+ }
+ }
+ }
+ }
+ return rows
+ }
+
+ // Enrich rows so they properly handle client bindings
+ const fetchSelf = API.fetchSelf
+ API.fetchSelf = async () => {
+ const user = await fetchSelf()
+ if (user && user._id) {
+ if (user.roleId === "PUBLIC") {
+ // Don't try to enrich a public user as it will 403
+ return user
+ } else {
+ return (await enrichRows([user], Constants.TableNames.USERS))[0]
+ }
+ } else {
+ return null
+ }
+ }
+ const fetchRelationshipData = API.fetchRelationshipData
+ API.fetchRelationshipData = async params => {
+ const tableId = params?.tableId
+ const rows = await fetchRelationshipData(params)
+ return await enrichRows(rows, tableId)
+ }
+ const fetchTableData = API.fetchTableData
+ API.fetchTableData = async tableId => {
+ const rows = await fetchTableData(tableId)
+ return await enrichRows(rows, tableId)
+ }
+ const searchTable = API.searchTable
+ API.searchTable = async params => {
+ const tableId = params?.tableId
+ const output = await searchTable(params)
+ return {
+ ...output,
+ rows: await enrichRows(output?.rows, tableId),
+ }
+ }
+ const fetchViewData = API.fetchViewData
+ API.fetchViewData = async params => {
+ const tableId = params?.tableId
+ const rows = await fetchViewData(params)
+ return await enrichRows(rows, tableId)
+ }
+
+ // Wipe any HBS formulae from table definitions, as these interfere with
+ // handlebars enrichment
+ const fetchTableDefinition = API.fetchTableDefinition
+ API.fetchTableDefinition = async tableId => {
+ const definition = await fetchTableDefinition(tableId)
+ Object.keys(definition?.schema || {}).forEach(field => {
+ if (definition.schema[field]?.type === "formula") {
+ delete definition.schema[field].formula
+ }
+ })
+ return definition
+ }
+}
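The `patchAPI` file above wraps selected client methods by capturing the original function and reassigning an enriched version. A self-contained sketch of that wrapper shape, using a hypothetical stand-in API object and a trivial enrichment in place of `enrichRows`:

```javascript
// Hypothetical stand-in API with one method; the real object comes from
// createAPIClient in @budibase/frontend-core
const API = {
  fetchTableData: async tableId => [
    { tableId, Name: "Row 1" },
    { tableId, Name: "Row 2" },
  ],
}

// Same wrapper shape as patchAPI: capture the original method, then
// reassign it with a version that post-processes the result
const patchAPI = api => {
  const fetchTableData = api.fetchTableData
  api.fetchTableData = async tableId => {
    const rows = await fetchTableData(tableId)
    // Stand-in for enrichRows: tag each row with a derived field
    return rows.map(row => ({ ...row, enriched: true }))
  }
}

patchAPI(API)
```

Because the wrapper closes over the original function, callers keep using the same `API.fetchTableData` name while every result is transparently enriched — which is how the client library gets row enrichment without forking the shared API client.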
diff --git a/packages/client/src/api/queries.js b/packages/client/src/api/queries.js
deleted file mode 100644
index e8972f657e..0000000000
--- a/packages/client/src/api/queries.js
+++ /dev/null
@@ -1,34 +0,0 @@
-import { notificationStore, dataSourceStore } from "stores"
-import API from "./api"
-
-/**
- * Executes a query against an external data connector.
- */
-export const executeQuery = async ({ queryId, pagination, parameters }) => {
- const query = await fetchQueryDefinition(queryId)
- if (query?.datasourceId == null) {
- notificationStore.actions.error("That query couldn't be found")
- return
- }
- const res = await API.post({
- url: `/api/v2/queries/${queryId}`,
- body: {
- parameters,
- pagination,
- },
- })
- if (res.error) {
- notificationStore.actions.error("An error has occurred")
- } else if (!query.readable) {
- notificationStore.actions.success("Query executed successfully")
- await dataSourceStore.actions.invalidateDataSource(query.datasourceId)
- }
- return res
-}
-
-/**
- * Fetches the definition of an external query.
- */
-export const fetchQueryDefinition = async queryId => {
- return await API.get({ url: `/api/queries/${queryId}`, cache: true })
-}
diff --git a/packages/client/src/api/relationships.js b/packages/client/src/api/relationships.js
deleted file mode 100644
index fe92bfd038..0000000000
--- a/packages/client/src/api/relationships.js
+++ /dev/null
@@ -1,14 +0,0 @@
-import API from "./api"
-import { enrichRows } from "./rows"
-
-/**
- * Fetches related rows for a certain field of a certain row.
- */
-export const fetchRelationshipData = async ({ tableId, rowId, fieldName }) => {
- if (!tableId || !rowId || !fieldName) {
- return []
- }
- const response = await API.get({ url: `/api/${tableId}/${rowId}/enrich` })
- const rows = response[fieldName] || []
- return await enrichRows(rows, tableId)
-}
diff --git a/packages/client/src/api/routes.js b/packages/client/src/api/routes.js
deleted file mode 100644
index d762461075..0000000000
--- a/packages/client/src/api/routes.js
+++ /dev/null
@@ -1,10 +0,0 @@
-import API from "./api"
-
-/**
- * Fetches available routes for the client app.
- */
-export const fetchRoutes = async () => {
- return await API.get({
- url: `/api/routing/client`,
- })
-}
diff --git a/packages/client/src/api/rows.js b/packages/client/src/api/rows.js
deleted file mode 100644
index 2d6df90e83..0000000000
--- a/packages/client/src/api/rows.js
+++ /dev/null
@@ -1,155 +0,0 @@
-import { notificationStore, dataSourceStore } from "stores"
-import API from "./api"
-import { fetchTableDefinition } from "./tables"
-import { FieldTypes } from "../constants"
-
-/**
- * Fetches data about a certain row in a table.
- */
-export const fetchRow = async ({ tableId, rowId }) => {
- if (!tableId || !rowId) {
- return
- }
- const row = await API.get({
- url: `/api/${tableId}/rows/${rowId}`,
- })
- return (await enrichRows([row], tableId))[0]
-}
-
-/**
- * Creates a row in a table.
- */
-export const saveRow = async row => {
- if (!row?.tableId) {
- return
- }
- const res = await API.post({
- url: `/api/${row.tableId}/rows`,
- body: row,
- })
- res.error
- ? notificationStore.actions.error("An error has occurred")
- : notificationStore.actions.success("Row saved")
-
- // Refresh related datasources
- await dataSourceStore.actions.invalidateDataSource(row.tableId)
-
- return res
-}
-
-/**
- * Updates a row in a table.
- */
-export const updateRow = async row => {
- if (!row?.tableId || !row?._id) {
- return
- }
- const res = await API.patch({
- url: `/api/${row.tableId}/rows`,
- body: row,
- })
- res.error
- ? notificationStore.actions.error("An error has occurred")
- : notificationStore.actions.success("Row updated")
-
- // Refresh related datasources
- await dataSourceStore.actions.invalidateDataSource(row.tableId)
-
- return res
-}
-
-/**
- * Deletes a row from a table.
- */
-export const deleteRow = async ({ tableId, rowId, revId }) => {
- if (!tableId || !rowId || !revId) {
- return
- }
- const res = await API.del({
- url: `/api/${tableId}/rows`,
- body: {
- _id: rowId,
- _rev: revId,
- },
- })
- res.error
- ? notificationStore.actions.error("An error has occurred")
- : notificationStore.actions.success("Row deleted")
-
- // Refresh related datasources
- await dataSourceStore.actions.invalidateDataSource(tableId)
-
- return res
-}
-
-/**
- * Deletes many rows from a table.
- */
-export const deleteRows = async ({ tableId, rows }) => {
- if (!tableId || !rows) {
- return
- }
- const res = await API.del({
- url: `/api/${tableId}/rows`,
- body: {
- rows,
- },
- })
- res.error
- ? notificationStore.actions.error("An error has occurred")
- : notificationStore.actions.success(`${rows.length} row(s) deleted`)
-
- // Refresh related datasources
- await dataSourceStore.actions.invalidateDataSource(tableId)
-
- return res
-}
-
-/**
- * Enriches rows which contain certain field types so that they can
- * be properly displayed.
- * The ability to create these bindings has been removed, but they will still
- * exist in client apps to support backwards compatibility.
- */
-export const enrichRows = async (rows, tableId) => {
- if (!Array.isArray(rows)) {
- return []
- }
- if (rows.length) {
- // map of tables, incase a row being loaded is not from the same table
- const tables = {}
- for (let row of rows) {
- // fallback to passed in tableId if row doesn't have it specified
- let rowTableId = row.tableId || tableId
- let table = tables[rowTableId]
- if (!table) {
- // Fetch table schema so we can check column types
- table = await fetchTableDefinition(rowTableId)
- tables[rowTableId] = table
- }
- const schema = table?.schema
- if (schema) {
- const keys = Object.keys(schema)
- for (let key of keys) {
- const type = schema[key].type
- if (type === FieldTypes.LINK && Array.isArray(row[key])) {
- // Enrich row a string join of relationship fields
- row[`${key}_text`] =
- row[key]
- ?.map(option => option?.primaryDisplay)
- .filter(option => !!option)
- .join(", ") || ""
- } else if (type === "attachment") {
- // Enrich row with the first image URL for any attachment fields
- let url = null
- if (Array.isArray(row[key]) && row[key][0] != null) {
- url = row[key][0].url
- }
- row[`${key}_first`] = url
- }
- }
- }
- }
- }
- return rows
-}
diff --git a/packages/client/src/api/tables.js b/packages/client/src/api/tables.js
deleted file mode 100644
index 09f77de6ee..0000000000
--- a/packages/client/src/api/tables.js
+++ /dev/null
@@ -1,63 +0,0 @@
-import API from "./api"
-import { enrichRows } from "./rows"
-
-/**
- * Fetches a table definition.
- * Since definitions cannot change at runtime, the result is cached.
- */
-export const fetchTableDefinition = async tableId => {
- const res = await API.get({ url: `/api/tables/${tableId}`, cache: true })
-
- // Wipe any HBS formulae, as these interfere with handlebars enrichment
- Object.keys(res?.schema || {}).forEach(field => {
- if (res.schema[field]?.type === "formula") {
- delete res.schema[field].formula
- }
- })
-
- return res
-}
-
-/**
- * Fetches all rows from a table.
- */
-export const fetchTableData = async tableId => {
- const rows = await API.get({ url: `/api/${tableId}/rows` })
- return await enrichRows(rows, tableId)
-}
-
-/**
- * Searches a table using Lucene.
- */
-export const searchTable = async ({
- tableId,
- query,
- bookmark,
- limit,
- sort,
- sortOrder,
- sortType,
- paginate,
-}) => {
- if (!tableId || !query) {
- return {
- rows: [],
- }
- }
- const res = await API.post({
- url: `/api/${tableId}/search`,
- body: {
- query,
- bookmark,
- limit,
- sort,
- sortOrder,
- sortType,
- paginate,
- },
- })
- return {
- ...res,
- rows: await enrichRows(res?.rows, tableId),
- }
-}
diff --git a/packages/client/src/api/views.js b/packages/client/src/api/views.js
deleted file mode 100644
index d173e53d53..0000000000
--- a/packages/client/src/api/views.js
+++ /dev/null
@@ -1,30 +0,0 @@
-import API from "./api"
-import { enrichRows } from "./rows"
-
-/**
- * Fetches all rows in a view.
- */
-export const fetchViewData = async ({
- name,
- field,
- groupBy,
- calculation,
- tableId,
-}) => {
- const params = new URLSearchParams()
-
- if (calculation) {
- params.set("field", field)
- params.set("calculation", calculation)
- }
- if (groupBy) {
- params.set("group", groupBy ? "true" : "false")
- }
-
- const QUERY_VIEW_URL = field
- ? `/api/views/${name}?${params}`
- : `/api/views/${name}`
-
- const rows = await API.get({ url: QUERY_VIEW_URL })
- return await enrichRows(rows, tableId)
-}
diff --git a/packages/client/src/components/ClientApp.svelte b/packages/client/src/components/ClientApp.svelte
index 7f5bed210e..5bd5d2d46f 100644
--- a/packages/client/src/components/ClientApp.svelte
+++ b/packages/client/src/components/ClientApp.svelte
@@ -2,6 +2,8 @@
import { writable, get } from "svelte/store"
import { setContext, onMount } from "svelte"
import { Layout, Heading, Body } from "@budibase/bbui"
+ import ErrorSVG from "@budibase/frontend-core/assets/error.svg"
+ import { Constants, CookieUtils } from "@budibase/frontend-core"
import Component from "./Component.svelte"
import SDK from "sdk"
import {
@@ -24,7 +26,6 @@
import HoverIndicator from "components/preview/HoverIndicator.svelte"
import CustomThemeWrapper from "./CustomThemeWrapper.svelte"
import DNDHandler from "components/preview/DNDHandler.svelte"
- import ErrorSVG from "builder/assets/error.svg"
import KeyboardManager from "components/preview/KeyboardManager.svelte"
// Provide contexts
@@ -63,9 +64,8 @@
} else {
// The user is not logged in, redirect them to login
const returnUrl = `${window.location.pathname}${window.location.hash}`
- // TODO: reuse `Cookies` from builder when frontend-core is added
- window.document.cookie = `budibase:returnurl=${returnUrl}; Path=/`
- window.location = `/builder/auth/login`
+ CookieUtils.setCookie(Constants.Cookies.ReturnUrl, returnUrl)
+ window.location = "/builder/auth/login"
}
}
}
diff --git a/packages/client/src/components/Component.svelte b/packages/client/src/components/Component.svelte
index 8cd1849336..f43c2b30ec 100644
--- a/packages/client/src/components/Component.svelte
+++ b/packages/client/src/components/Component.svelte
@@ -9,7 +9,7 @@
import Router from "./Router.svelte"
import { enrichProps, propsAreSame } from "utils/componentProps"
import { builderStore } from "stores"
- import { hashString } from "utils/helpers"
+ import { Helpers } from "@budibase/bbui"
import Manifest from "manifest.json"
import { getActiveConditions, reduceConditionActions } from "utils/conditions"
import Placeholder from "components/app/Placeholder.svelte"
@@ -106,7 +106,7 @@
// Raw settings are all settings excluding internal props and children
$: rawSettings = getRawSettings(instance)
- $: instanceKey = hashString(JSON.stringify(rawSettings))
+ $: instanceKey = Helpers.hashString(JSON.stringify(rawSettings))
// Update and enrich component settings
$: updateSettings(rawSettings, instanceKey, settingsDefinition, $context)
@@ -118,9 +118,6 @@
// Build up the final settings object to be passed to the component
$: cacheSettings(enrichedSettings, nestedSettings, conditionalSettings)
- // Render key is used to determine when components need to fully remount
- $: renderKey = getRenderKey(id, editing)
-
// Update component context
$: componentStore.set({
id,
@@ -276,8 +273,7 @@
// reactive statements as much as possible.
const cacheSettings = (enriched, nested, conditional) => {
const allSettings = { ...enriched, ...nested, ...conditional }
- const mounted = ref?.$$set != null
- if (!cachedSettings || !mounted) {
+ if (!cachedSettings) {
cachedSettings = { ...allSettings }
initialSettings = cachedSettings
} else {
@@ -290,51 +286,54 @@
// setting it on initialSettings directly, we avoid a double render.
cachedSettings[key] = allSettings[key]
- // Programmatically set the prop to avoid svelte reactive statements
- // firing inside components. This circumvents the problems caused by
- // spreading a props object.
- ref.$$set({ [key]: allSettings[key] })
+ if (ref?.$$set) {
+ // Programmatically set the prop to avoid svelte reactive statements
+ // firing inside components. This circumvents the problems caused by
+ // spreading a props object.
+ ref.$$set({ [key]: allSettings[key] })
+ } else {
+ // Sometimes enrichment can occur multiple times before the
+ // component has mounted and been assigned a ref.
+ // In these cases, for some reason we need to update the
+ // initial settings object, even though it is equivalent by
+ // reference to cached settings. This solves the problem of multiple
+ // initial enrichments, while also not causing wasted renders for
+ // any components not affected by this issue.
+ initialSettings[key] = allSettings[key]
+ }
}
})
}
}
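The branch above can be modelled outside Svelte: once a component ref with `$$set` exists, changed settings are applied programmatically; before mount, the initial settings object is mutated instead so the first render is correct. A minimal sketch in plain JavaScript, with `ref` standing in for a mounted Svelte component (names here are illustrative, not the actual client API):

```javascript
// Sketch of the cacheSettings branch: apply a changed setting either via a
// mounted component's programmatic setter, or by mutating initialSettings
// when no ref has been assigned yet.
const applySetting = (ref, initialSettings, cachedSettings, key, value) => {
  cachedSettings[key] = value
  if (ref?.$$set) {
    // Mounted: set the prop directly to avoid spreading a props object
    ref.$$set({ [key]: value })
  } else {
    // Not yet mounted: update initial settings so the first render is correct
    initialSettings[key] = value
  }
}

// Before mount: no ref, so initialSettings receives the value
const initialSettings = {}
const cachedSettings = {}
applySetting(null, initialSettings, cachedSettings, "text", "Hello")

// After mount: a ref exposing $$set receives the value instead
const received = {}
const ref = { $$set: props => Object.assign(received, props) }
applySetting(ref, initialSettings, cachedSettings, "color", "red")
```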
-
- // Generates a key used to determine when components need to fully remount.
- // Currently only toggling editing requires remounting.
- const getRenderKey = (id, editing) => {
- return hashString(`${id}-${editing}`)
- }
-{#key renderKey}
- {#if constructor && initialSettings && (visible || inSelectedPath)}
-
-
-
-
- {#if children.length}
- {#each children as child (child._id)}
-
- {/each}
- {:else if emptyState}
-
- {:else if isBlock}
-
- {/if}
-
-
- {/if}
-{/key}
+{#if constructor && initialSettings && (visible || inSelectedPath)}
+
+
+
+
+ {#if children.length}
+ {#each children as child (child._id)}
+
+ {/each}
+ {:else if emptyState}
+
+ {:else if isBlock}
+
+ {/if}
+
+
+{/if}
diff --git a/packages/client/src/components/app/forms/RelationshipField.svelte b/packages/client/src/components/app/forms/RelationshipField.svelte
index dbf708f893..6089939dcd 100644
--- a/packages/client/src/components/app/forms/RelationshipField.svelte
+++ b/packages/client/src/components/app/forms/RelationshipField.svelte
@@ -12,6 +12,7 @@
export let disabled = false
export let validation
export let autocomplete = false
+ export let defaultValue
let fieldState
let fieldApi
@@ -27,20 +28,25 @@
$: singleValue = flatten(fieldState?.value)?.[0]
$: multiValue = flatten(fieldState?.value) ?? []
$: component = multiselect ? CoreMultiselect : CoreSelect
+ $: expandedDefaultValue = expand(defaultValue)
const fetchTable = async id => {
if (id) {
- const result = await API.fetchTableDefinition(id)
- if (!result.error) {
- tableDefinition = result
+ try {
+ tableDefinition = await API.fetchTableDefinition(id)
+ } catch (error) {
+ tableDefinition = null
}
}
}
const fetchRows = async id => {
if (id) {
- const rows = await API.fetchTableData(id)
- options = rows && !rows.error ? rows : []
+ try {
+ options = await API.fetchTableData(id)
+ } catch (error) {
+ options = []
+ }
}
}
@@ -62,6 +68,16 @@
const multiHandler = e => {
fieldApi.setValue(e.detail)
}
+
+ const expand = values => {
+ if (!values) {
+ return []
+ }
+ if (Array.isArray(values)) {
+ return values
+ }
+ return values.split(",").map(value => value.trim())
+ }
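The new `expand` helper normalises a relationship field's default value into an array, accepting either an array or a comma-separated string. Extracted for illustration:

```javascript
// expand(): turn a default relationship value into an array of trimmed IDs.
// Accepts an array (returned as-is), a comma-separated string, or a falsy
// value (which yields an empty array).
const expand = values => {
  if (!values) {
    return []
  }
  if (Array.isArray(values)) {
    return values
  }
  return values.split(",").map(value => value.trim())
}
```

For example, `expand("ro_1, ro_2")` yields `["ro_1", "ro_2"]`, while `expand(null)` yields `[]`.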
{#if fieldState}
{
loading = true
try {
- const res = await API.externalUpload(datasourceId, bucket, key, data)
+ const res = await API.externalUpload({
+ datasourceId,
+ bucket,
+ key,
+ data,
+ })
notificationStore.actions.success("File uploaded successfully")
loading = false
return res
diff --git a/packages/client/src/components/app/forms/StringField.svelte b/packages/client/src/components/app/forms/StringField.svelte
index 4764cba4d3..bb598bb1e0 100644
--- a/packages/client/src/components/app/forms/StringField.svelte
+++ b/packages/client/src/components/app/forms/StringField.svelte
@@ -9,6 +9,7 @@
export let disabled = false
export let validation
export let defaultValue = ""
+ export let align
let fieldState
let fieldApi
@@ -34,6 +35,7 @@
id={fieldState.fieldId}
{placeholder}
{type}
+ {align}
/>
{/if}
diff --git a/packages/client/src/components/app/index.js b/packages/client/src/components/app/index.js
index ef0f96ce59..5af62201e5 100644
--- a/packages/client/src/components/app/index.js
+++ b/packages/client/src/components/app/index.js
@@ -30,6 +30,7 @@ export { default as daterangepicker } from "./DateRangePicker.svelte"
export { default as cardstat } from "./CardStat.svelte"
export { default as spectrumcard } from "./SpectrumCard.svelte"
export { default as tag } from "./Tag.svelte"
+export { default as markdownviewer } from "./MarkdownViewer.svelte"
export * from "./charts"
export * from "./forms"
export * from "./table"
diff --git a/packages/client/src/components/context/UserBindingsProvider.svelte b/packages/client/src/components/context/UserBindingsProvider.svelte
index fb0dffcb68..e788d80dc4 100644
--- a/packages/client/src/components/context/UserBindingsProvider.svelte
+++ b/packages/client/src/components/context/UserBindingsProvider.svelte
@@ -1,7 +1,8 @@
diff --git a/packages/client/src/components/overlay/NotificationDisplay.svelte b/packages/client/src/components/overlay/NotificationDisplay.svelte
index 6e8be21647..667f706ff2 100644
--- a/packages/client/src/components/overlay/NotificationDisplay.svelte
+++ b/packages/client/src/components/overlay/NotificationDisplay.svelte
@@ -19,6 +19,8 @@
type={$notificationStore.type}
message={$notificationStore.message}
icon={$notificationStore.icon}
+ dismissable={$notificationStore.dismissable}
+ on:dismiss={notificationStore.actions.dismiss}
/>
{/key}
diff --git a/packages/client/src/components/overlay/PeekScreenDisplay.svelte b/packages/client/src/components/overlay/PeekScreenDisplay.svelte
index 7d3531d236..72ea58c194 100644
--- a/packages/client/src/components/overlay/PeekScreenDisplay.svelte
+++ b/packages/client/src/components/overlay/PeekScreenDisplay.svelte
@@ -25,8 +25,8 @@
}
const proxyNotification = event => {
- const { message, type, icon } = event.detail
- notificationStore.actions.send(message, type, icon)
+ const { message, type, icon, autoDismiss } = event.detail
+ notificationStore.actions.send(message, type, icon, autoDismiss)
}
const proxyStateUpdate = event => {
diff --git a/packages/client/src/components/preview/DNDHandler.svelte b/packages/client/src/components/preview/DNDHandler.svelte
index 82828b1258..ca083dd01e 100644
--- a/packages/client/src/components/preview/DNDHandler.svelte
+++ b/packages/client/src/components/preview/DNDHandler.svelte
@@ -143,7 +143,7 @@
// Callback when entering a potential drop target
const onDragEnter = e => {
// Skip if we aren't validly dragging currently
- if (!dragInfo) {
+ if (!dragInfo || !e.target.closest) {
return
}
diff --git a/packages/client/src/constants.js b/packages/client/src/constants.js
index 9d20177b52..965ca788e1 100644
--- a/packages/client/src/constants.js
+++ b/packages/client/src/constants.js
@@ -1,7 +1,3 @@
-export const TableNames = {
- USERS: "ta_users",
-}
-
export const FieldTypes = {
STRING: "string",
LONGFORM: "longform",
@@ -32,11 +28,3 @@ export const ActionTypes = {
ClearForm: "ClearForm",
ChangeFormStep: "ChangeFormStep",
}
-
-export const ApiVersion = "1"
-
-/**
- * API Version Changelog
- * v1:
- * - Coerce types for search endpoint
- */
diff --git a/packages/client/src/sdk.js b/packages/client/src/sdk.js
index 9803730541..4851b2cc02 100644
--- a/packages/client/src/sdk.js
+++ b/packages/client/src/sdk.js
@@ -1,4 +1,4 @@
-import * as API from "./api"
+import { API } from "api"
import {
authStore,
notificationStore,
@@ -10,9 +10,9 @@ import {
import { styleable } from "utils/styleable"
import { linkable } from "utils/linkable"
import { getAction } from "utils/getAction"
-import { fetchDatasourceSchema } from "utils/schema.js"
import Provider from "components/context/Provider.svelte"
-import { ActionTypes } from "constants"
+import { ActionTypes } from "./constants"
+import { fetchDatasourceSchema } from "./utils/schema.js"
export default {
API,
diff --git a/packages/client/src/stores/app.js b/packages/client/src/stores/app.js
index 0cabaec4ab..a28a4cd9eb 100644
--- a/packages/client/src/stores/app.js
+++ b/packages/client/src/stores/app.js
@@ -1,8 +1,8 @@
-import * as API from "../api"
+import { API } from "api"
import { get, writable } from "svelte/store"
const createAppStore = () => {
- const store = writable({})
+ const store = writable(null)
// Fetches the app definition including screens, layouts and theme
const fetchAppDefinition = async () => {
@@ -10,17 +10,25 @@ const createAppStore = () => {
if (!appId) {
throw "Cannot fetch app definition without app ID set"
}
- const appDefinition = await API.fetchAppPackage(appId)
- store.set({
- ...appDefinition,
- appId: appDefinition?.application?.appId,
- })
+ try {
+ const appDefinition = await API.fetchAppPackage(appId)
+ store.set({
+ ...appDefinition,
+ appId: appDefinition?.application?.appId,
+ })
+ } catch (error) {
+ store.set(null)
+ }
}
// Sets the initial app ID
const setAppID = id => {
store.update(state => {
- state.appId = id
+ if (state) {
+ state.appId = id
+ } else {
+ state = { appId: id }
+ }
return state
})
}
diff --git a/packages/client/src/stores/auth.js b/packages/client/src/stores/auth.js
index 9cd2613e24..39f11319cf 100644
--- a/packages/client/src/stores/auth.js
+++ b/packages/client/src/stores/auth.js
@@ -1,4 +1,4 @@
-import * as API from "../api"
+import { API } from "api"
import { writable } from "svelte/store"
const createAuthStore = () => {
@@ -6,8 +6,12 @@ const createAuthStore = () => {
// Fetches the user object if someone is logged in and has reloaded the page
const fetchUser = async () => {
- const user = await API.fetchSelf()
- store.set(user)
+ try {
+ const user = await API.fetchSelf()
+ store.set(user)
+ } catch (error) {
+ store.set(null)
+ }
}
const logOut = async () => {
diff --git a/packages/client/src/stores/builder.js b/packages/client/src/stores/builder.js
index 35fb3edae2..719909b538 100644
--- a/packages/client/src/stores/builder.js
+++ b/packages/client/src/stores/builder.js
@@ -1,7 +1,7 @@
import { writable, derived, get } from "svelte/store"
import Manifest from "manifest.json"
import { findComponentById, findComponentPathById } from "../utils/components"
-import { pingEndUser } from "../api"
+import { API } from "api"
const dispatchEvent = (type, data = {}) => {
window.parent.postMessage({ type, data })
@@ -65,8 +65,12 @@ const createBuilderStore = () => {
notifyLoaded: () => {
dispatchEvent("preview-loaded")
},
- pingEndUser: () => {
- pingEndUser()
+ pingEndUser: async () => {
+ try {
+ await API.pingEndUser()
+ } catch (error) {
+ // Do nothing
+ }
},
setSelectedPath: path => {
writableStore.update(state => ({ ...state, selectedPath: path }))
diff --git a/packages/client/src/stores/context.js b/packages/client/src/stores/context.js
index fcbcf0f592..9c35bc4862 100644
--- a/packages/client/src/stores/context.js
+++ b/packages/client/src/stores/context.js
@@ -1,5 +1,5 @@
import { writable, derived } from "svelte/store"
-import { hashString } from "../utils/helpers"
+import { Helpers } from "@budibase/bbui"
export const createContextStore = oldContext => {
const newContext = writable({})
@@ -10,7 +10,9 @@ export const createContextStore = oldContext => {
for (let i = 0; i < $contexts.length - 1; i++) {
key += $contexts[i].key
}
- key = hashString(key + JSON.stringify($contexts[$contexts.length - 1]))
+ key = Helpers.hashString(
+ key + JSON.stringify($contexts[$contexts.length - 1])
+ )
// Reduce global state
const reducer = (total, context) => ({ ...total, ...context })
diff --git a/packages/client/src/stores/dataSource.js b/packages/client/src/stores/dataSource.js
index 46ac0b6c86..d5ad0cb594 100644
--- a/packages/client/src/stores/dataSource.js
+++ b/packages/client/src/stores/dataSource.js
@@ -1,5 +1,5 @@
import { writable, get } from "svelte/store"
-import { fetchTableDefinition } from "../api"
+import { API } from "api"
import { FieldTypes } from "../constants"
import { routeStore } from "./routes"
@@ -72,8 +72,14 @@ export const createDataSourceStore = () => {
let invalidations = [dataSourceId]
// Fetch related table IDs from table schema
- const definition = await fetchTableDefinition(dataSourceId)
- const schema = definition?.schema
+ let schema
+ try {
+ const definition = await API.fetchTableDefinition(dataSourceId)
+ schema = definition?.schema
+ } catch (error) {
+ schema = null
+ }
+
if (schema) {
Object.values(schema).forEach(fieldSchema => {
if (
diff --git a/packages/client/src/stores/notification.js b/packages/client/src/stores/notification.js
index 64178328c0..e12eccf210 100644
--- a/packages/client/src/stores/notification.js
+++ b/packages/client/src/stores/notification.js
@@ -19,7 +19,7 @@ const createNotificationStore = () => {
setTimeout(() => (block = false), timeout)
}
- const send = (message, type = "info", icon) => {
+ const send = (message, type = "info", icon, autoDismiss = true) => {
if (block) {
return
}
@@ -32,6 +32,7 @@ const createNotificationStore = () => {
message,
type,
icon,
+ autoDismiss,
},
})
return
@@ -42,12 +43,20 @@ const createNotificationStore = () => {
type,
message,
icon,
+ dismissable: !autoDismiss,
delay: get(store) != null,
})
clearTimeout(timeout)
- timeout = setTimeout(() => {
- store.set(null)
- }, NOTIFICATION_TIMEOUT)
+ if (autoDismiss) {
+ timeout = setTimeout(() => {
+ store.set(null)
+ }, NOTIFICATION_TIMEOUT)
+ }
+ }
+
+ const dismiss = () => {
+ clearTimeout(timeout)
+ store.set(null)
}
return {
@@ -57,8 +66,9 @@ const createNotificationStore = () => {
info: msg => send(msg, "info", "Info"),
success: msg => send(msg, "success", "CheckmarkCircle"),
warning: msg => send(msg, "warning", "Alert"),
- error: msg => send(msg, "error", "Alert"),
+ error: msg => send(msg, "error", "Alert", false),
blockNotifications,
+ dismiss,
},
}
}
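The `autoDismiss` flow introduced above can be modelled without Svelte stores: errors are sent with `autoDismiss = false`, become dismissable, and only clear via `dismiss()`. A minimal sketch under those assumptions (timers elided):

```javascript
// Minimal model of the notification store's autoDismiss behaviour.
const createNotifications = () => {
  let current = null
  const send = (message, type = "info", icon, autoDismiss = true) => {
    // A notification is dismissable only when it does not auto-dismiss
    current = { message, type, icon, dismissable: !autoDismiss }
  }
  const dismiss = () => {
    current = null
  }
  return {
    get current() {
      return current
    },
    success: msg => send(msg, "success", "CheckmarkCircle"),
    error: msg => send(msg, "error", "Alert", false),
    dismiss,
  }
}

const notifications = createNotifications()
notifications.success("Saved")        // auto-dismisses, not dismissable
const afterSuccess = notifications.current
notifications.error("Request failed") // persists until dismissed
const afterError = notifications.current
notifications.dismiss()
```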
diff --git a/packages/client/src/stores/routes.js b/packages/client/src/stores/routes.js
index d50677493b..69cd42d5f5 100644
--- a/packages/client/src/stores/routes.js
+++ b/packages/client/src/stores/routes.js
@@ -1,6 +1,6 @@
import { get, writable } from "svelte/store"
import { push } from "svelte-spa-router"
-import * as API from "../api"
+import { API } from "api"
import { peekStore } from "./peek"
import { builderStore } from "./builder"
@@ -16,9 +16,14 @@ const createRouteStore = () => {
const store = writable(initialState)
const fetchRoutes = async () => {
- const routeConfig = await API.fetchRoutes()
+ let routeConfig
+ try {
+ routeConfig = await API.fetchClientAppRoutes()
+ } catch (error) {
+ routeConfig = null
+ }
let routes = []
- Object.values(routeConfig.routes || {}).forEach(route => {
+ Object.values(routeConfig?.routes || {}).forEach(route => {
Object.entries(route.subpaths || {}).forEach(([path, config]) => {
routes.push({
path,
diff --git a/packages/client/src/stores/state.js b/packages/client/src/stores/state.js
index 0297b4c532..b757334526 100644
--- a/packages/client/src/stores/state.js
+++ b/packages/client/src/stores/state.js
@@ -1,10 +1,10 @@
import { writable, get, derived } from "svelte/store"
-import { localStorageStore } from "builder/src/builderStore/store/localStorage"
+import { createLocalStorageStore } from "@budibase/frontend-core"
const createStateStore = () => {
const appId = window["##BUDIBASE_APP_ID##"] || "app"
const localStorageKey = `${appId}.state`
- const persistentStore = localStorageStore(localStorageKey, {})
+ const persistentStore = createLocalStorageStore(localStorageKey, {})
// Initialise the temp store to mirror the persistent store
const tempStore = writable(get(persistentStore))
diff --git a/packages/client/src/utils/buttonActions.js b/packages/client/src/utils/buttonActions.js
index 560aaa59c4..72c8f9c083 100644
--- a/packages/client/src/utils/buttonActions.js
+++ b/packages/client/src/utils/buttonActions.js
@@ -5,12 +5,14 @@ import {
confirmationStore,
authStore,
stateStore,
+ notificationStore,
+ dataSourceStore,
uploadStore,
} from "stores"
-import { saveRow, deleteRow, executeQuery, triggerAutomation } from "api"
+import { API } from "api"
import { ActionTypes } from "constants"
import { enrichDataBindings } from "./enrichDataBinding"
-import { deepSet } from "@budibase/bbui"
+import { Helpers } from "@budibase/bbui"
const saveRowHandler = async (action, context) => {
const { fields, providerId, tableId } = action.parameters
@@ -22,15 +24,23 @@ const saveRowHandler = async (action, context) => {
}
if (fields) {
for (let [field, value] of Object.entries(fields)) {
- deepSet(payload, field, value)
+ Helpers.deepSet(payload, field, value)
}
}
if (tableId) {
payload.tableId = tableId
}
- const row = await saveRow(payload)
- return {
- row,
+ try {
+ const row = await API.saveRow(payload)
+ notificationStore.actions.success("Row saved")
+
+ // Refresh related datasources
+ await dataSourceStore.actions.invalidateDataSource(row.tableId)
+
+ return { row }
+ } catch (error) {
+ // Abort next actions
+ return false
}
}
@@ -40,7 +50,7 @@ const duplicateRowHandler = async (action, context) => {
let payload = { ...context[providerId] }
if (fields) {
for (let [field, value] of Object.entries(fields)) {
- deepSet(payload, field, value)
+ Helpers.deepSet(payload, field, value)
}
}
if (tableId) {
@@ -48,9 +58,17 @@ const duplicateRowHandler = async (action, context) => {
}
delete payload._id
delete payload._rev
- const row = await saveRow(payload)
- return {
- row,
+ try {
+ const row = await API.saveRow(payload)
+ notificationStore.actions.success("Row saved")
+
+ // Refresh related datasources
+ await dataSourceStore.actions.invalidateDataSource(row.tableId)
+
+ return { row }
+ } catch (error) {
+ // Abort next actions
+ return false
}
}
}
@@ -58,14 +76,32 @@ const duplicateRowHandler = async (action, context) => {
const deleteRowHandler = async action => {
const { tableId, revId, rowId } = action.parameters
if (tableId && revId && rowId) {
- await deleteRow({ tableId, rowId, revId })
+ try {
+ await API.deleteRow({ tableId, rowId, revId })
+ notificationStore.actions.success("Row deleted")
+
+ // Refresh related datasources
+ await dataSourceStore.actions.invalidateDataSource(tableId)
+ } catch (error) {
+ // Abort next actions
+ return false
+ }
}
}
const triggerAutomationHandler = async action => {
const { fields } = action.parameters
if (fields) {
- await triggerAutomation(action.parameters.automationId, fields)
+ try {
+ await API.triggerAutomation({
+ automationId: action.parameters.automationId,
+ fields,
+ })
+ notificationStore.actions.success("Automation triggered")
+ } catch (error) {
+ // Abort next actions
+ return false
+ }
}
}
@@ -76,12 +112,30 @@ const navigationHandler = action => {
const queryExecutionHandler = async action => {
const { datasourceId, queryId, queryParams } = action.parameters
- const result = await executeQuery({
- datasourceId,
- queryId,
- parameters: queryParams,
- })
- return { result }
+ try {
+ const query = await API.fetchQueryDefinition(queryId)
+ if (query?.datasourceId == null) {
+ notificationStore.actions.error("That query couldn't be found")
+ return false
+ }
+ const result = await API.executeQuery({
+ datasourceId,
+ queryId,
+ parameters: queryParams,
+ })
+
+ // Trigger a notification and invalidate the datasource as long as this
+ // was not a readable query
+ if (!query.readable) {
+ notificationStore.actions.success("Query executed successfully")
+ await dataSourceStore.actions.invalidateDataSource(query.datasourceId)
+ }
+
+ return { result }
+ } catch (error) {
+ // Abort next actions
+ return false
+ }
}
const executeActionHandler = async (
diff --git a/packages/client/src/utils/conditions.js b/packages/client/src/utils/conditions.js
index 2791fa169e..1914e942ad 100644
--- a/packages/client/src/utils/conditions.js
+++ b/packages/client/src/utils/conditions.js
@@ -1,4 +1,4 @@
-import { buildLuceneQuery, luceneQuery } from "builder/src/helpers/lucene"
+import { LuceneUtils } from "@budibase/frontend-core"
export const getActiveConditions = conditions => {
if (!conditions?.length) {
@@ -33,8 +33,8 @@ export const getActiveConditions = conditions => {
value: condition.referenceValue,
}
- const query = buildLuceneQuery([luceneCondition])
- const result = luceneQuery([luceneCondition], query)
+ const query = LuceneUtils.buildLuceneQuery([luceneCondition])
+ const result = LuceneUtils.runLuceneQuery([luceneCondition], query)
return result.length > 0
})
}
diff --git a/packages/client/src/utils/fetch/JSONArrayFetch.js b/packages/client/src/utils/fetch/JSONArrayFetch.js
deleted file mode 100644
index 8beb555ef9..0000000000
--- a/packages/client/src/utils/fetch/JSONArrayFetch.js
+++ /dev/null
@@ -1,13 +0,0 @@
-import FieldFetch from "./FieldFetch.js"
-import { fetchTableDefinition } from "api"
-import { getJSONArrayDatasourceSchema } from "builder/src/builderStore/jsonUtils"
-
-export default class JSONArrayFetch extends FieldFetch {
- static async getDefinition(datasource) {
- // JSON arrays need their table definitions fetched.
- // We can then extract their schema as a subset of the table schema.
- const table = await fetchTableDefinition(datasource.tableId)
- const schema = getJSONArrayDatasourceSchema(table?.schema, datasource)
- return { schema }
- }
-}
diff --git a/packages/client/src/utils/fetch/QueryFetch.js b/packages/client/src/utils/fetch/QueryFetch.js
deleted file mode 100644
index 76aca2a855..0000000000
--- a/packages/client/src/utils/fetch/QueryFetch.js
+++ /dev/null
@@ -1,73 +0,0 @@
-import DataFetch from "./DataFetch.js"
-import { executeQuery, fetchQueryDefinition } from "api"
-import { cloneDeep } from "lodash/fp"
-import { get } from "svelte/store"
-
-export default class QueryFetch extends DataFetch {
- determineFeatureFlags(definition) {
- const supportsPagination =
- !!definition?.fields?.pagination?.type &&
- !!definition?.fields?.pagination?.location &&
- !!definition?.fields?.pagination?.pageParam
- return { supportsPagination }
- }
-
- static async getDefinition(datasource) {
- if (!datasource?._id) {
- return null
- }
- const definition = await fetchQueryDefinition(datasource._id)
- // The server strips the "fields" attribute from a query definition for security reasons. However, this attribute needs to be inside the definition for pagination, so restore it from the datasource.
- if (!definition.fields) {
- definition.fields = datasource.fields
- }
- return definition
- }
-
- async getData() {
- const { datasource, limit, paginate } = this.options
- const { supportsPagination } = get(this.featureStore)
- const { cursor, definition } = get(this.store)
- const type = definition?.fields?.pagination?.type
-
- // Set the default query params
- let parameters = cloneDeep(datasource?.queryParams || {})
- for (let param of datasource?.parameters || {}) {
- if (!parameters[param.name]) {
- parameters[param.name] = param.default
- }
- }
-
- // Add pagination to query if supported
- let queryPayload = { queryId: datasource?._id, parameters }
- if (paginate && supportsPagination) {
- const requestCursor = type === "page" ? parseInt(cursor || 1) : cursor
- queryPayload.pagination = { page: requestCursor, limit }
- }
-
- // Execute query
- const { data, pagination, ...rest } = await executeQuery(queryPayload)
-
- // Derive pagination info from response
- let nextCursor = null
- let hasNextPage = false
- if (paginate && supportsPagination) {
- if (type === "page") {
- // For "page number" pagination, increment the existing page number
- nextCursor = queryPayload.pagination.page + 1
- hasNextPage = data?.length === limit && limit > 0
- } else {
- // For "cursor" pagination, the cursor should be in the response
- nextCursor = pagination?.cursor
- hasNextPage = nextCursor != null
- }
- }
-
- return {
- rows: data || [],
- info: rest,
- cursor: nextCursor,
- hasNextPage,
- }
- }
-}
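The deleted `QueryFetch` (which moves to `@budibase/frontend-core`) derives pagination info differently for the two pagination types: "page" pagination increments the page number and infers another page from a full result set, while "cursor" pagination trusts the cursor in the response. A sketch of that derivation, with illustrative parameter names:

```javascript
// Derive the next cursor and hasNextPage flag from a query response,
// following the logic of the removed getData() implementation.
const derivePagination = ({ type, page, limit, data, responseCursor }) => {
  if (type === "page") {
    return {
      // Page-number pagination: increment the existing page number, and
      // assume another page exists only when this one came back full
      nextCursor: page + 1,
      hasNextPage: data.length === limit && limit > 0,
    }
  }
  // Cursor pagination: the next cursor should be in the response, if any
  return {
    nextCursor: responseCursor ?? null,
    hasNextPage: responseCursor != null,
  }
}
```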
diff --git a/packages/client/src/utils/fetch/RelationshipFetch.js b/packages/client/src/utils/fetch/RelationshipFetch.js
deleted file mode 100644
index b8adf686f1..0000000000
--- a/packages/client/src/utils/fetch/RelationshipFetch.js
+++ /dev/null
@@ -1,16 +0,0 @@
-import DataFetch from "./DataFetch.js"
-import { fetchRelationshipData } from "api"
-
-export default class RelationshipFetch extends DataFetch {
- async getData() {
- const { datasource } = this.options
- const res = await fetchRelationshipData({
- rowId: datasource?.rowId,
- tableId: datasource?.rowTableId,
- fieldName: datasource?.fieldName,
- })
- return {
- rows: res || [],
- }
- }
-}
diff --git a/packages/client/src/utils/helpers.js b/packages/client/src/utils/helpers.js
deleted file mode 100644
index 1df4aee85e..0000000000
--- a/packages/client/src/utils/helpers.js
+++ /dev/null
@@ -1,57 +0,0 @@
-/**
- * Capitalises a string.
- *
- * @param string
- * @returns {string}
- */
-export const capitalise = string => {
- return string.substring(0, 1).toUpperCase() + string.substring(1)
-}
-
-/**
- * Generates a short random ID.
- * This is "nanoid" but rollup was derping attempting to bundle it, so the
- * source has just been extracted manually since it's tiny.
- */
-export const generateID = (size = 21) => {
- let id = ""
- let bytes = crypto.getRandomValues(new Uint8Array(size))
-
- // A compact alternative for `for (var i = 0; i < step; i++)`.
- while (size--) {
- // It is incorrect to use bytes exceeding the alphabet size.
- // The following mask reduces the random byte in the 0-255 value
- // range to the 0-63 value range. Therefore, adding hacks, such
- // as empty string fallback or magic numbers, is unnecessary because
- // the bitmask trims bytes down to the alphabet size.
- let byte = bytes[size] & 63
- if (byte < 36) {
- // `0-9a-z`
- id += byte.toString(36)
- } else if (byte < 62) {
- // `A-Z`
- id += (byte - 26).toString(36).toUpperCase()
- } else if (byte < 63) {
- id += "_"
- } else {
- id += "-"
- }
- }
- return id
-}
-
-/**
- * Computes a short hash of a string
- */
-export const hashString = str => {
- if (!str) {
- return 0
- }
- let hash = 0
- for (let i = 0; i < str.length; i++) {
- let char = str.charCodeAt(i)
- hash = (hash << 5) - hash + char
- hash = hash & hash // Convert to 32bit integer
- }
- return hash
-}
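The helpers removed here now come from `@budibase/bbui` (used above as `Helpers.hashString`). For reference, the hash is a standard 32-bit rolling string hash; reproduced as a standalone sketch:

```javascript
// Computes a short 32-bit hash of a string, matching the removed helper.
// Falsy input hashes to 0; equal strings always hash to the same value.
const hashString = str => {
  if (!str) {
    return 0
  }
  let hash = 0
  for (let i = 0; i < str.length; i++) {
    const char = str.charCodeAt(i)
    hash = (hash << 5) - hash + char
    hash = hash & hash // Constrain to a 32-bit integer
  }
  return hash
}
```

This is the key used elsewhere in the diff to detect changed settings (`Helpers.hashString(JSON.stringify(rawSettings))`) and context changes.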
diff --git a/packages/client/src/utils/schema.js b/packages/client/src/utils/schema.js
index 4189aa7f2b..fba25e384a 100644
--- a/packages/client/src/utils/schema.js
+++ b/packages/client/src/utils/schema.js
@@ -1,12 +1,12 @@
-import { convertJSONSchemaToTableSchema } from "builder/src/builderStore/jsonUtils"
-import TableFetch from "./fetch/TableFetch.js"
-import ViewFetch from "./fetch/ViewFetch.js"
-import QueryFetch from "./fetch/QueryFetch.js"
-import RelationshipFetch from "./fetch/RelationshipFetch.js"
-import NestedProviderFetch from "./fetch/NestedProviderFetch.js"
-import FieldFetch from "./fetch/FieldFetch.js"
-import JSONArrayFetch from "./fetch/JSONArrayFetch.js"
-import DataFetch from "./fetch/DataFetch.js"
+import { API } from "api"
+import { JSONUtils } from "@budibase/frontend-core"
+import TableFetch from "@budibase/frontend-core/src/fetch/TableFetch.js"
+import ViewFetch from "@budibase/frontend-core/src/fetch/ViewFetch.js"
+import QueryFetch from "@budibase/frontend-core/src/fetch/QueryFetch.js"
+import RelationshipFetch from "@budibase/frontend-core/src/fetch/RelationshipFetch.js"
+import NestedProviderFetch from "@budibase/frontend-core/src/fetch/NestedProviderFetch.js"
+import FieldFetch from "@budibase/frontend-core/src/fetch/FieldFetch.js"
+import JSONArrayFetch from "@budibase/frontend-core/src/fetch/JSONArrayFetch.js"
/**
* Fetches the schema of any kind of datasource.
@@ -31,10 +31,11 @@ export const fetchDatasourceSchema = async (
if (!handler) {
return null
}
+ const instance = new handler({ API })
// Get the datasource definition and then schema
- const definition = await handler.getDefinition(datasource)
- let schema = handler.getSchema(datasource, definition)
+ const definition = await instance.getDefinition(datasource)
+ let schema = instance.getSchema(datasource, definition)
if (!schema) {
return null
}
@@ -44,7 +45,7 @@ export const fetchDatasourceSchema = async (
Object.keys(schema).forEach(fieldKey => {
const fieldSchema = schema[fieldKey]
if (fieldSchema?.type === "json") {
- const jsonSchema = convertJSONSchemaToTableSchema(fieldSchema, {
+ const jsonSchema = JSONUtils.convertJSONSchemaToTableSchema(fieldSchema, {
squashObjects: true,
})
Object.keys(jsonSchema).forEach(jsonKey => {
@@ -78,5 +79,5 @@ export const fetchDatasourceSchema = async (
}
// Ensure schema structure is correct
- return DataFetch.enrichSchema(schema)
+ return instance.enrichSchema(schema)
}
diff --git a/packages/client/yarn.lock b/packages/client/yarn.lock
index 7a6c780a87..308ccc8808 100644
--- a/packages/client/yarn.lock
+++ b/packages/client/yarn.lock
@@ -2,11 +2,6 @@
# yarn lockfile v1
-"@adobe/spectrum-css-workflow-icons@^1.2.1":
- version "1.2.1"
- resolved "https://registry.yarnpkg.com/@adobe/spectrum-css-workflow-icons/-/spectrum-css-workflow-icons-1.2.1.tgz#7e2cb3fcfb5c8b12d7275afafbb6ec44913551b4"
- integrity sha512-uVgekyBXnOVkxp+CUssjN/gefARtudZC8duEn1vm0lBQFwGRZFlDEzU1QC+aIRWCrD1Z8OgRpmBYlSZ7QS003w==
-
"@babel/code-frame@^7.10.4":
version "7.16.0"
resolved "https://registry.yarnpkg.com/@babel/code-frame/-/code-frame-7.16.0.tgz#0dfc80309beec8411e65e706461c408b0bb9b431"
@@ -28,73 +23,6 @@
chalk "^2.0.0"
js-tokens "^4.0.0"
-"@budibase/bbui@^0.9.139":
- version "0.9.187"
- resolved "https://registry.yarnpkg.com/@budibase/bbui/-/bbui-0.9.187.tgz#84f0a37301cfa41f50eaa335243ac08923d9e34f"
- integrity sha512-Qy24x99NloRAoG78NMdzoJuX3Gbf+eZdHeYTAeUchljB4o2W2L/Ous8qYBzqigYtVcChjzteSTAZ2jCLq458Vg==
- dependencies:
- "@adobe/spectrum-css-workflow-icons" "^1.2.1"
- "@spectrum-css/actionbutton" "^1.0.1"
- "@spectrum-css/actiongroup" "^1.0.1"
- "@spectrum-css/avatar" "^3.0.2"
- "@spectrum-css/button" "^3.0.1"
- "@spectrum-css/buttongroup" "^3.0.2"
- "@spectrum-css/checkbox" "^3.0.2"
- "@spectrum-css/dialog" "^3.0.1"
- "@spectrum-css/divider" "^1.0.3"
- "@spectrum-css/dropzone" "^3.0.2"
- "@spectrum-css/fieldgroup" "^3.0.2"
- "@spectrum-css/fieldlabel" "^3.0.1"
- "@spectrum-css/icon" "^3.0.1"
- "@spectrum-css/illustratedmessage" "^3.0.2"
- "@spectrum-css/inputgroup" "^3.0.2"
- "@spectrum-css/label" "^2.0.10"
- "@spectrum-css/link" "^3.1.1"
- "@spectrum-css/menu" "^3.0.1"
- "@spectrum-css/modal" "^3.0.1"
- "@spectrum-css/pagination" "^3.0.3"
- "@spectrum-css/picker" "^1.0.1"
- "@spectrum-css/popover" "^3.0.1"
- "@spectrum-css/progressbar" "^1.0.2"
- "@spectrum-css/progresscircle" "^1.0.2"
- "@spectrum-css/radio" "^3.0.2"
- "@spectrum-css/search" "^3.0.2"
- "@spectrum-css/sidenav" "^3.0.2"
- "@spectrum-css/statuslight" "^3.0.2"
- "@spectrum-css/stepper" "^3.0.3"
- "@spectrum-css/switch" "^1.0.2"
- "@spectrum-css/table" "^3.0.1"
- "@spectrum-css/tabs" "^3.0.1"
- "@spectrum-css/tags" "^3.0.2"
- "@spectrum-css/textfield" "^3.0.1"
- "@spectrum-css/toast" "^3.0.1"
- "@spectrum-css/tooltip" "^3.0.3"
- "@spectrum-css/treeview" "^3.0.2"
- "@spectrum-css/typography" "^3.0.1"
- "@spectrum-css/underlay" "^2.0.9"
- "@spectrum-css/vars" "^3.0.1"
- dayjs "^1.10.4"
- svelte-flatpickr "^3.2.3"
- svelte-portal "^1.0.0"
-
-"@budibase/standard-components@^0.9.139":
- version "0.9.139"
- resolved "https://registry.yarnpkg.com/@budibase/standard-components/-/standard-components-0.9.139.tgz#cf8e2b759ae863e469e50272b3ca87f2827e66e3"
- integrity sha512-Av0u9Eq2jerjhG6Atta+c0mOQGgE5K0QI3cm+8s/3Vki6/PXkO1YL5Alo3BOn9ayQAVZ/xp4rtZPuN/rzRibHw==
- dependencies:
- "@budibase/bbui" "^0.9.139"
- "@spectrum-css/button" "^3.0.3"
- "@spectrum-css/card" "^3.0.3"
- "@spectrum-css/divider" "^1.0.3"
- "@spectrum-css/link" "^3.1.3"
- "@spectrum-css/page" "^3.0.1"
- "@spectrum-css/typography" "^3.0.2"
- "@spectrum-css/vars" "^3.0.1"
- apexcharts "^3.22.1"
- dayjs "^1.10.5"
- svelte-apexcharts "^1.0.2"
- svelte-flatpickr "^3.1.0"
-
"@rollup/plugin-alias@^3.1.5":
version "3.1.8"
resolved "https://registry.yarnpkg.com/@rollup/plugin-alias/-/plugin-alias-3.1.8.tgz#645fd84659e08d3d1b059408fcdf69c1dd435a6b"
@@ -145,46 +73,16 @@
estree-walker "^1.0.1"
picomatch "^2.2.2"
-"@spectrum-css/actionbutton@^1.0.1":
- version "1.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/actionbutton/-/actionbutton-1.0.9.tgz#b4fe75f9f4327264b5903ed4a0c4d50da4f6da3f"
- integrity sha512-wDRJWbWrPTOJZTXCqwpUUfFYXfQwGjkd0bQdvZMUdtBbMnck7yBuWFAL0T5JPQUN9LjFcUyAxGRiwkjoDtwtqQ==
-
-"@spectrum-css/actiongroup@^1.0.1":
- version "1.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/actiongroup/-/actiongroup-1.0.9.tgz#f0c3b2f1ecca11517a47c860904db97f35f1afd4"
- integrity sha512-HGsVvWuDV2et0Z9VZhgeOYD3DH5U80kx9L/rrIGGi25mderyfDHGJZfAQfJg8fptTsXhq3Nv6u4DUaUJlproCw==
-
-"@spectrum-css/avatar@^3.0.2":
- version "3.0.2"
- resolved "https://registry.yarnpkg.com/@spectrum-css/avatar/-/avatar-3.0.2.tgz#4f1826223eae330e64b6d3cc899e9bc2e98dac95"
- integrity sha512-wEczvSqxttTWSiL3cOvXV/RmGRwSkw2w6+slcHhnf0kb7ovymMM+9oz8vvEpEsSeo5u598bc+7ktrKFpAd6soQ==
-
-"@spectrum-css/button@^3.0.1", "@spectrum-css/button@^3.0.3":
+"@spectrum-css/button@^3.0.3":
version "3.0.3"
resolved "https://registry.yarnpkg.com/@spectrum-css/button/-/button-3.0.3.tgz#2df1efaab6c7e0b3b06cb4b59e1eae59c7f1fc84"
integrity sha512-6CnLPqqtaU/PcSSIGeGRi0iFIIxIUByYLKFO6zn5NEUc12KQ28dJ4PLwB6WBa0L8vRoAGlnWWH2ZZweTijbXgg==
-"@spectrum-css/buttongroup@^3.0.2":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/buttongroup/-/buttongroup-3.0.9.tgz#62d6f04994846f9dd341f811e101d372fc200ae0"
- integrity sha512-OsvTl5x6vfRSy8kYGZMzdp1ksUyKeySBa5QWSUl/g1j6XztDeB8Lxapc+VUV6TmrB9qhWglZNVodlIL2jsiWKw==
-
"@spectrum-css/card@^3.0.3":
version "3.0.3"
resolved "https://registry.yarnpkg.com/@spectrum-css/card/-/card-3.0.3.tgz#56b2e2da6b80c1583228baa279de7407383bfb6b"
integrity sha512-+oKLUI2a0QmQP9EzySeq/G4FpUkkdaDNbuEbqCj2IkPMc/2v/nwzsPhh1fj2UIghGAiiUwXfPpzax1e8fyhQUg==
-"@spectrum-css/checkbox@^3.0.2":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/checkbox/-/checkbox-3.0.9.tgz#7fef1887912289f7c86e181dca24108e6be7d90e"
- integrity sha512-lSlkvNHwgSX+JNzzJ5OgZ/a7KtikcthJi2YWcZb567RH8g0+SQ6ooYBRVDFvB6aH6eypiX/7n6Nh03yYT3w+Ug==
-
-"@spectrum-css/dialog@^3.0.1":
- version "3.0.10"
- resolved "https://registry.yarnpkg.com/@spectrum-css/dialog/-/dialog-3.0.10.tgz#f70ed878385bce6bf39c106b31abed20a13e8ccf"
- integrity sha512-8Rq4tTz+CUdVGm2B4ifsw87EOraFvuaZiJwuMyPe0XW2VZ0LiAaiIBJq8s+NCMMwlGxkvEC0HFjWphC3nSj64A==
-
"@spectrum-css/divider@^1.0.3":
version "1.0.9"
resolved "https://registry.yarnpkg.com/@spectrum-css/divider/-/divider-1.0.9.tgz#00246bd453981c4696149d26f5bcfeefd29b4b53"
@@ -192,56 +90,11 @@
dependencies:
"@spectrum-css/vars" "^4.3.0"
-"@spectrum-css/dropzone@^3.0.2":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/dropzone/-/dropzone-3.0.9.tgz#7dc39302fbebeb16b28c0b5070de86f33757f752"
- integrity sha512-VQBCHvrt3vShgch1DTgc9ls3hx5Tre30dLxdeDFWtflyBkDrEhzwNblpL1+fB70suqDK6eanOrcnuC1VVp7gkA==
-
-"@spectrum-css/fieldgroup@^3.0.2":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/fieldgroup/-/fieldgroup-3.0.9.tgz#067310ab302b5777ceb7712511405043edf2a6ec"
- integrity sha512-y8CRNyUfi0mKDrPTbAjkea4hQp9WxZkkECUO3Ca8BrA5TN/nuQ91FyZzZ0fn68XACvi41nDLMhYp/e8dE5Ahtw==
-
-"@spectrum-css/fieldlabel@^3.0.1":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/fieldlabel/-/fieldlabel-3.0.3.tgz#f73c04d20734d4718ffb620dc46458904685b449"
- integrity sha512-nEvIkEXCD5n4fW67Unq6Iu7VXoauEd/JGpfTY02VsC5p4FJLnwKfPDbJUuUsqClAxqw7nAsmXVKtn4zQFf5yPQ==
-
-"@spectrum-css/icon@^3.0.1":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/icon/-/icon-3.0.9.tgz#6608239fb32a5c622c6c04d61fc185797e8e0410"
- integrity sha512-aX7B5+XOl4ObkJVcyyUCIbeFSdSXVAuyRQbLMzHetv85yihJX3D91jt1thkQVG2wS47jSl76QMq3WQ8DZxbQ+A==
-
-"@spectrum-css/illustratedmessage@^3.0.2":
- version "3.0.8"
- resolved "https://registry.yarnpkg.com/@spectrum-css/illustratedmessage/-/illustratedmessage-3.0.8.tgz#69ef0c935bcc5027f233a78de5aeb0064bf033cb"
- integrity sha512-HvC4dywDi11GdrXQDCvKQ0vFlrXLTyJuc9UKf7meQLCGoJbGYDBwe+tHXNK1c6gPMD9BoL6pPMP1K/vRzR4EBQ==
-
-"@spectrum-css/inputgroup@^3.0.2":
- version "3.0.8"
- resolved "https://registry.yarnpkg.com/@spectrum-css/inputgroup/-/inputgroup-3.0.8.tgz#fc23afc8a73c24d17249c9d2337e8b42085b298b"
- integrity sha512-cmQWzFp0GU+4IMc8SSeVFdmQDlRUdPelXaQdKUR9mZuO2iYettg37s0lfBCeJyYkUNTagz0zP8O7A0iXfmeE6g==
-
-"@spectrum-css/label@^2.0.10":
- version "2.0.10"
- resolved "https://registry.yarnpkg.com/@spectrum-css/label/-/label-2.0.10.tgz#2368651d7636a19385b5d300cdf6272db1916001"
- integrity sha512-xCbtEiQkZIlLdWFikuw7ifDCC21DOC/KMgVrrVJHXFc4KRQe9LTZSqmGF3tovm+CSq1adE59mYoTbojVQ9YuEQ==
-
-"@spectrum-css/link@^3.1.1", "@spectrum-css/link@^3.1.3":
+"@spectrum-css/link@^3.1.3":
version "3.1.9"
resolved "https://registry.yarnpkg.com/@spectrum-css/link/-/link-3.1.9.tgz#fe40db561c98bf2987489541ef39dcc71416908f"
integrity sha512-/DpmLIbQGDBNZl+Fnf5VDQ34uC6E6Bz393CAYkzYFyadtvzVEy+PGCgUkT3Tgrwu833IW9fZOh7rkKjw1o/Zng==
-"@spectrum-css/menu@^3.0.1":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/menu/-/menu-3.0.9.tgz#f1f5e7b715fa979701f535545628d44faca815e0"
- integrity sha512-vEXdpfzmoYYyA/ShReqc2+aG5BGCFwOybpJSzDIPfWTNIk/1IyjCycJo4+sRIE1CXS1Z7mP+PnJa+8EjXqnYGw==
-
-"@spectrum-css/modal@^3.0.1":
- version "3.0.8"
- resolved "https://registry.yarnpkg.com/@spectrum-css/modal/-/modal-3.0.8.tgz#b1bb62bd10e1b2c37bef447e72e9ada34b974321"
- integrity sha512-wJsTKp3ApCVOUdASbjxuxt3ngqFo31S0sDeOYTE752eckB+fYnUOzDfm5bGvBjhsgAMqmXwlnj/4kRjfVSRN8A==
-
"@spectrum-css/page@^3.0.1":
version "3.0.8"
resolved "https://registry.yarnpkg.com/@spectrum-css/page/-/page-3.0.8.tgz#001efa9e4c10095df9b2b37cf7d7d6eb60140190"
@@ -249,111 +102,16 @@
dependencies:
"@spectrum-css/vars" "^4.3.0"
-"@spectrum-css/pagination@^3.0.3":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/pagination/-/pagination-3.0.9.tgz#272f344ba0c38eae020a9a1a04c8a6d95fab29a2"
- integrity sha512-u3AEHAzXBFp6yvQij8nfrLdmwxE8N1eJdJlvaPNA4epKv/+qQEFDZ/2/RJAcA24sRBbDNWwN7TxcNayS+cQ1ag==
-
-"@spectrum-css/picker@^1.0.1":
- version "1.1.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/picker/-/picker-1.1.3.tgz#0dbe04801e04ebead9630e66f6864bf2458d38ea"
- integrity sha512-Ln4FyYhiE+2G7pJIlD0W8vqCqc1fi3j4m4YwdJzNdjG3gnwScolBwm8LRXNOnMFGcnedB0xtxYAxg54gDZi6bA==
-
-"@spectrum-css/popover@^3.0.1":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/popover/-/popover-3.0.9.tgz#256b396d939cacb8d3a285980bbe7c54c2b35606"
- integrity sha512-7JcjWkhIgPRhMCAvS2sELIDjgdFgEZn7PrKgudmpgvlFk19AlWvO/55RIWSvwQnX5xHQG29S8Vi1LZ9X/oBAiQ==
-
-"@spectrum-css/progressbar@^1.0.2":
- version "1.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/progressbar/-/progressbar-1.0.9.tgz#3988c9a74fad9639c9756c7cd8248ae2deecbe73"
- integrity sha512-1mT8PT2pjUbxY/fj5/a/FQFiSswju3dYo0RwVFVweD6SLsJl7VUbjskBYObnF6pOlq/pBIfvfWFZIaIEJVWSLA==
-
-"@spectrum-css/progresscircle@^1.0.2":
- version "1.0.8"
- resolved "https://registry.yarnpkg.com/@spectrum-css/progresscircle/-/progresscircle-1.0.8.tgz#f254e225829c011bb79d2c303beac58bbea51efd"
- integrity sha512-5/uSO/T1Vggb5soAlYiaUdP9uaNuqEgRhpiHjyFg9EFQIfgbDFIq68aV91GNQzmZNOJgFORvv0cSpvn9z/HCWA==
-
-"@spectrum-css/radio@^3.0.2":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/radio/-/radio-3.0.9.tgz#915752e96e83b647bd19e3c8419ff328b23048f3"
- integrity sha512-eZmwC6o/H8Zu/rcbSIVpQLC/B4XRqdltH1GRBcjPcTue5Q0yeCeUZLKdSfsNimEE+8Kz8C334I1d1vxmNGcnAg==
-
-"@spectrum-css/search@^3.0.2":
- version "3.1.2"
- resolved "https://registry.yarnpkg.com/@spectrum-css/search/-/search-3.1.2.tgz#8d43f35f884f7c190e7694c8d26a3f2cfed01ef0"
- integrity sha512-8cMK1QB07dbReZ/ECyTyoT2dELZ7hK1b3jEDiWSeLBbXcKirR1OI24sZEnewQY/XWFd/62Z1YdNaaA8S6UuXWQ==
-
-"@spectrum-css/sidenav@^3.0.2":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/sidenav/-/sidenav-3.0.9.tgz#494d62fd0c83a32362db8c62f75b673d60974b1f"
- integrity sha512-WkuCtbiwWgPelJZSGgS9zJwC6/EZPrOZR+RqAdEeIRbjkLOYmdFJl1PCCUpRTHFBaondceIceFI1smZLRofxNg==
-
-"@spectrum-css/statuslight@^3.0.2":
- version "3.0.8"
- resolved "https://registry.yarnpkg.com/@spectrum-css/statuslight/-/statuslight-3.0.8.tgz#3b0ea80712573679870a85d469850230e794a0f7"
- integrity sha512-zMTHs8lk+I7fLdi9waEEbsCmJ1FxeHcjQ0yltWxuRmGk2vl4MQdQIuHIMI63iblqEaiwnJRjXJoKnWlNvndTJQ==
-
-"@spectrum-css/stepper@^3.0.3":
- version "3.0.9"
- resolved "https://registry.yarnpkg.com/@spectrum-css/stepper/-/stepper-3.0.9.tgz#6b2df8fbfb181224b95246fb4cd12de9ff67802a"
- integrity sha512-w0Ksfd8BTgMgt1lD+ng6/51Hj6J7oJ1d+KbT+HX9bjVNXJN84VrYU1P63vSG3V0p8bbtVOGNPjRFJb98nP2CWg==
-
-"@spectrum-css/switch@^1.0.2":
- version "1.0.8"
- resolved "https://registry.yarnpkg.com/@spectrum-css/switch/-/switch-1.0.8.tgz#449841596a9093f9205ba835353cbd5f7932e3e7"
- integrity sha512-tV5sX+C9hMMIxWMLZnAbXbRDIfOb3BBj9CB52o3ocEExBLv7o6SlekiZLVmYCCDrOJVrztRV3fwqLoPV3VMMuw==
-
-"@spectrum-css/table@^3.0.1":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/table/-/table-3.0.3.tgz#7f7f19905ef3275cbf907ce3a5818e63c30b2caf"
- integrity sha512-nxwzVjLPsXoY/v4sdxOVYLcC+cEbGgJyLcLclT5LT9MGSbngFeUMJzzVR4EvehzuN4dH7hrATG7Mbuq29Mf0Hg==
-
-"@spectrum-css/tabs@^3.0.1":
- version "3.1.5"
- resolved "https://registry.yarnpkg.com/@spectrum-css/tabs/-/tabs-3.1.5.tgz#cc82e69c1fc721902345178231fb95d05938b983"
- integrity sha512-UtfW8bA1quYnJM6v/lp6AVYGnQFkiUix2FHAf/4VHVrk4mh7ydtLiXS0IR3Kx+t/S8FWdSdSQHDZ8tHbY1ZLZg==
-
"@spectrum-css/tag@^3.1.4":
version "3.1.4"
resolved "https://registry.yarnpkg.com/@spectrum-css/tag/-/tag-3.1.4.tgz#334384dd789ddf0562679cae62ef763883480ac5"
integrity sha512-9dYBMhCEkjy+p75XJIfCA2/zU4JAqsJrL7fkYIDXakS6/BzeVtIvAW/6JaIHtLIA9lrj0Sn4m+ZjceKnZNIv1w==
-"@spectrum-css/tags@^3.0.2":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/tags/-/tags-3.0.3.tgz#fc76d2735cdc442de91b7eb3bee49a928c0767ac"
- integrity sha512-SL8vPxVDfWcY5VdIuyl0TImEXcOU1I7yCyXkk7MudMwfnYs81FaIyY32hFV9OHj0Tz/36UzRzc7AVMSuRQ53pw==
-
-"@spectrum-css/textfield@^3.0.1":
- version "3.1.0"
- resolved "https://registry.yarnpkg.com/@spectrum-css/textfield/-/textfield-3.1.0.tgz#4268bf200e589d5bcfc88d9734c36dacc3a9e62b"
- integrity sha512-QMDkq/q2Is0YI3s6jxYyURQ7JlSCduEYX9kh2YDedxJBqwZ1IMDBBH9Pr2iYm4dbN6dLAe1ZgDlcD/BAMnnQEA==
-
-"@spectrum-css/toast@^3.0.1":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/toast/-/toast-3.0.3.tgz#97c1527384707600832ecda35643ed304615250f"
- integrity sha512-CjLeaMs+cjUXojCCRtbj0YkD2BoZW16kjj2o5omkEpUTjA34IJ8xJ1a+CCtDILWekhXvN0MBN4sbumcnwcnx8w==
-
-"@spectrum-css/tooltip@^3.0.3":
- version "3.1.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/tooltip/-/tooltip-3.1.3.tgz#88d1f5b2141ea729fe9e4a99de1ea6ce8b028cfb"
- integrity sha512-BIOCE1gM74MzVPgSleI/5nGOl1SiNDD15by48FY1fD/PaeeCamzFhRBkOaj48Htc+n+WimhsJnbxEfjPw9+8Sg==
-
-"@spectrum-css/treeview@^3.0.2":
- version "3.0.3"
- resolved "https://registry.yarnpkg.com/@spectrum-css/treeview/-/treeview-3.0.3.tgz#aeda5175158b9f8d7529cb2b394428eb2a428046"
- integrity sha512-D5gGzZC/KtRArdx86Mesc9+99W9nTbUOeyYGqoJoAfJSOttoT6Tk5CrDvlCmAqjKf5rajemAkGri1ChqvUIwkw==
-
-"@spectrum-css/typography@^3.0.1", "@spectrum-css/typography@^3.0.2":
+"@spectrum-css/typography@^3.0.2":
version "3.0.2"
resolved "https://registry.yarnpkg.com/@spectrum-css/typography/-/typography-3.0.2.tgz#ea3ca0a60e18064527819d48c8c4364cab4fcd38"
integrity sha512-5ZOLmQe0edzsDMyhghUd4hBb5uxGsFrxzf+WasfcUw9klSfTsRZ09n1BsaaWbgrLjlMQ+EEHS46v5VNo0Ms2CA==
-"@spectrum-css/underlay@^2.0.9":
- version "2.0.17"
- resolved "https://registry.yarnpkg.com/@spectrum-css/underlay/-/underlay-2.0.17.tgz#a1a9b71d4714563ed016906be90e68f9b8809302"
- integrity sha512-Afqhc7k8HBqMZ8jkpvl1MqeWRzwrXcdFFkMHiTNPNaJrCYNETyVRlQvvZMNftXOxrzbg+49Ux6FUCFLINnwGwQ==
-
"@spectrum-css/vars@^3.0.1":
version "3.0.2"
resolved "https://registry.yarnpkg.com/@spectrum-css/vars/-/vars-3.0.2.tgz#ea9062c3c98dfc6ba59e5df14a03025ad8969999"
@@ -725,7 +483,7 @@ data-urls@^2.0.0:
whatwg-mimetype "^2.3.0"
whatwg-url "^8.0.0"
-dayjs@^1.10.4, dayjs@^1.10.5:
+dayjs@^1.10.5:
version "1.10.7"
resolved "https://registry.yarnpkg.com/dayjs/-/dayjs-1.10.7.tgz#2cf5f91add28116748440866a0a1d26f3a6ce468"
integrity sha512-P6twpd70BcPK34K26uJ1KT3wlhpuOAPoMwJzpsIWUxHZ7wpmbdZL/hQqBDfz7hGurYSa5PhzdhDHtt319hL3ig==
@@ -1832,18 +1590,13 @@ svelte-apexcharts@^1.0.2:
dependencies:
apexcharts "^3.19.2"
-svelte-flatpickr@^3.1.0, svelte-flatpickr@^3.2.3:
+svelte-flatpickr@^3.1.0:
version "3.2.4"
resolved "https://registry.yarnpkg.com/svelte-flatpickr/-/svelte-flatpickr-3.2.4.tgz#1824e26a5dc151d14906cfc7dfd100aefd1b072d"
integrity sha512-EE2wbFfpZ3iCBOXRRW52w436Jv5lqFoJkd/1vB8XmkfASJgF9HrrZ6Er11NWSmmpaV1nPywwDYFXdWHCB+Wi9Q==
dependencies:
flatpickr "^4.5.2"
-svelte-portal@^1.0.0:
- version "1.0.0"
- resolved "https://registry.yarnpkg.com/svelte-portal/-/svelte-portal-1.0.0.tgz#36a47c5578b1a4d9b4dc60fa32a904640ec4cdd3"
- integrity sha512-nHf+DS/jZ6jjnZSleBMSaZua9JlG5rZv9lOGKgJuaZStfevtjIlUJrkLc3vbV8QdBvPPVmvcjTlazAzfKu0v3Q==
-
svelte-spa-router@^3.0.5:
version "3.2.0"
resolved "https://registry.yarnpkg.com/svelte-spa-router/-/svelte-spa-router-3.2.0.tgz#fae3311d292451236cb57131262406cf312b15ee"
diff --git a/packages/frontend-core/.gitignore b/packages/frontend-core/.gitignore
new file mode 100644
index 0000000000..1947eba17b
--- /dev/null
+++ b/packages/frontend-core/.gitignore
@@ -0,0 +1,5 @@
+.DS_Store
+node_modules
+package-lock.json
+release/
+dist/
\ No newline at end of file
diff --git a/packages/builder/assets/error.svg b/packages/frontend-core/assets/error.svg
similarity index 100%
rename from packages/builder/assets/error.svg
rename to packages/frontend-core/assets/error.svg
diff --git a/packages/frontend-core/package.json b/packages/frontend-core/package.json
new file mode 100644
index 0000000000..62eeb4b8b1
--- /dev/null
+++ b/packages/frontend-core/package.json
@@ -0,0 +1,13 @@
+{
+ "name": "@budibase/frontend-core",
+ "version": "1.0.50-alpha.6",
+ "description": "Budibase frontend core libraries used in builder and client",
+ "author": "Budibase",
+ "license": "MPL-2.0",
+ "svelte": "src/index.js",
+ "dependencies": {
+ "@budibase/bbui": "^1.0.50-alpha.6",
+ "lodash": "^4.17.21",
+ "svelte": "^3.46.2"
+ }
+}
diff --git a/packages/frontend-core/src/api/analytics.js b/packages/frontend-core/src/api/analytics.js
new file mode 100644
index 0000000000..0402365e73
--- /dev/null
+++ b/packages/frontend-core/src/api/analytics.js
@@ -0,0 +1,19 @@
+export const buildAnalyticsEndpoints = API => ({
+ /**
+ * Notifies that an end user client app has been loaded.
+ */
+ pingEndUser: async () => {
+ return await API.post({
+ url: `/api/analytics/ping`,
+ })
+ },
+
+ /**
+ * Gets the current status of analytics for this environment
+ */
+ getAnalyticsStatus: async () => {
+ return await API.get({
+ url: "/api/analytics",
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/app.js b/packages/frontend-core/src/api/app.js
new file mode 100644
index 0000000000..897e735ea1
--- /dev/null
+++ b/packages/frontend-core/src/api/app.js
@@ -0,0 +1,155 @@
+export const buildAppEndpoints = API => ({
+ /**
+ * Fetches screen definition for an app.
+ * @param appId the ID of the app to fetch from
+ */
+ fetchAppPackage: async appId => {
+ return await API.get({
+ url: `/api/applications/${appId}/appPackage`,
+ })
+ },
+
+ /**
+ * Saves and patches metadata about an app.
+ * @param appId the ID of the app to update
+ * @param metadata the app metadata to save
+ */
+ saveAppMetadata: async ({ appId, metadata }) => {
+ return await API.put({
+ url: `/api/applications/${appId}`,
+ body: metadata,
+ })
+ },
+
+ /**
+ * Deploys the current app.
+ */
+ deployAppChanges: async () => {
+ return await API.post({
+ url: "/api/deploy",
+ })
+ },
+
+ /**
+ * Reverts an app to a previous version.
+ * @param appId the app ID to revert
+ */
+ revertAppChanges: async appId => {
+ return await API.post({
+ url: `/api/dev/${appId}/revert`,
+ })
+ },
+
+ /**
+ * Updates an app's version of the client library.
+ * @param appId the app ID to update
+ */
+ updateAppClientVersion: async appId => {
+ return await API.post({
+ url: `/api/applications/${appId}/client/update`,
+ })
+ },
+
+ /**
+ * Reverts an app's version of the client library to the previous version.
+ * @param appId the app ID to revert
+ */
+ revertAppClientVersion: async appId => {
+ return await API.post({
+ url: `/api/applications/${appId}/client/revert`,
+ })
+ },
+
+ /**
+ * Gets a list of app deployments.
+ */
+ getAppDeployments: async () => {
+ return await API.get({
+ url: "/api/deployments",
+ })
+ },
+
+ /**
+ * Creates an app.
+ * @param app the app to create
+ */
+ createApp: async app => {
+ return await API.post({
+ url: "/api/applications",
+ body: app,
+ json: false,
+ })
+ },
+
+ /**
+ * Imports an export of all apps.
+ * @param apps the FormData containing the apps to import
+ */
+ importApps: async apps => {
+ return await API.post({
+ url: "/api/cloud/import",
+ body: apps,
+ json: false,
+ })
+ },
+
+ /**
+ * Unpublishes a published app.
+ * @param appId the production ID of the app to unpublish
+ */
+ unpublishApp: async appId => {
+ return await API.delete({
+ url: `/api/applications/${appId}?unpublish=1`,
+ })
+ },
+
+ /**
+ * Deletes a dev app.
+ * @param appId the dev app ID to delete
+ */
+ deleteApp: async appId => {
+ return await API.delete({
+ url: `/api/applications/${appId}`,
+ })
+ },
+
+ /**
+ * Releases the lock on a dev app.
+ * @param appId the dev app ID to unlock
+ */
+ releaseAppLock: async appId => {
+ return await API.delete({
+ url: `/api/dev/${appId}/lock`,
+ })
+ },
+
+ /**
+ * Syncs an app with the production database.
+ * @param appId the ID of the app to sync
+ */
+ syncApp: async appId => {
+ return await API.post({
+ url: `/api/applications/${appId}/sync`,
+ })
+ },
+
+ /**
+ * Gets a list of apps.
+ */
+ getApps: async () => {
+ return await API.get({
+ url: "/api/applications?status=all",
+ })
+ },
+
+ /**
+ * Fetches the definitions for component library components. This includes
+ * their props and other metadata from components.json.
+ * @param {string} appId - ID of the currently running app
+ */
+ fetchComponentLibDefinitions: async appId => {
+ return await API.get({
+ url: `/api/${appId}/components/definitions`,
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/attachments.js b/packages/frontend-core/src/api/attachments.js
new file mode 100644
index 0000000000..2077c4f7ef
--- /dev/null
+++ b/packages/frontend-core/src/api/attachments.js
@@ -0,0 +1,61 @@
+export const buildAttachmentEndpoints = API => ({
+ /**
+ * Uploads an attachment to the server.
+ * @param data the attachment to upload
+ * @param tableId the table ID to upload to
+ */
+ uploadAttachment: async ({ data, tableId }) => {
+ return await API.post({
+ url: `/api/attachments/${tableId}/upload`,
+ body: data,
+ json: false,
+ })
+ },
+
+ /**
+ * Uploads an attachment to the server as a builder user from the builder.
+ * @param data the data to upload
+ */
+ uploadBuilderAttachment: async data => {
+ return await API.post({
+ url: "/api/attachments/process",
+ body: data,
+ json: false,
+ })
+ },
+
+ /**
+ * Generates a signed URL to upload a file to an external datasource.
+ * @param datasourceId the ID of the datasource to upload to
+ * @param bucket the name of the bucket to upload to
+ * @param key the name of the file to upload to
+ */
+ getSignedDatasourceURL: async ({ datasourceId, bucket, key }) => {
+ return await API.post({
+ url: `/api/attachments/${datasourceId}/url`,
+ body: { bucket, key },
+ })
+ },
+
+ /**
+ * Uploads a file to an external datasource.
+ * @param datasourceId the ID of the datasource to upload to
+ * @param bucket the name of the bucket to upload to
+ * @param key the name of the file to upload to
+ * @param data the file to upload
+ */
+ externalUpload: async ({ datasourceId, bucket, key, data }) => {
+ const { signedUrl, publicUrl } = await API.getSignedDatasourceURL({
+ datasourceId,
+ bucket,
+ key,
+ })
+ await API.put({
+ url: signedUrl,
+ body: data,
+ json: false,
+ external: true,
+ })
+ return { publicUrl }
+ },
+})
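The `externalUpload` helper above is a two-step flow: first ask the server to sign an upload URL for the datasource, then PUT the file directly to that URL, bypassing the server. A standalone sketch of the same flow with a stubbed client (the stub's URL values and recorded-call shape are illustrative assumptions, not real endpoints):

```javascript
// Sketch of the two-step external upload flow, using a stub API client that
// records requests instead of fetching. URLs returned here are assumptions.
const recorded = []
const stubAPI = {
  post: async req => {
    recorded.push(req)
    return {
      signedUrl: "https://bucket.example.com/upload?sig=abc",
      publicUrl: "https://bucket.example.com/file.png",
    }
  },
  put: async req => {
    recorded.push(req)
    return {}
  },
}

const externalUpload = async (API, { datasourceId, bucket, key, data }) => {
  // Step 1: ask the server to sign an upload URL for this datasource/bucket/key.
  const { signedUrl, publicUrl } = await API.post({
    url: `/api/attachments/${datasourceId}/url`,
    body: { bucket, key },
  })
  // Step 2: PUT the raw file straight to the signed URL, bypassing the server.
  await API.put({ url: signedUrl, body: data, json: false, external: true })
  return { publicUrl }
}

externalUpload(stubAPI, {
  datasourceId: "ds_123",
  bucket: "my-bucket",
  key: "file.png",
  data: "raw-bytes",
}).then(({ publicUrl }) => console.log(publicUrl))
```

Returning `publicUrl` to the caller lets the client reference the uploaded file without another round trip to the server.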
diff --git a/packages/frontend-core/src/api/auth.js b/packages/frontend-core/src/api/auth.js
new file mode 100644
index 0000000000..9289d71239
--- /dev/null
+++ b/packages/frontend-core/src/api/auth.js
@@ -0,0 +1,76 @@
+export const buildAuthEndpoints = API => ({
+ /**
+ * Performs a login request.
+ * @param tenantId the ID of the tenant to log in to
+ * @param username the username (email)
+ * @param password the password
+ */
+ logIn: async ({ tenantId, username, password }) => {
+ return await API.post({
+ url: `/api/global/auth/${tenantId}/login`,
+ body: {
+ username,
+ password,
+ },
+ })
+ },
+
+ /**
+ * Logs the user out and invalidates their session.
+ */
+ logOut: async () => {
+ return await API.post({
+ url: "/api/global/auth/logout",
+ })
+ },
+
+ /**
+ * Sets initialisation info.
+ * @param info the info to set
+ */
+ setInitInfo: async info => {
+ return await API.post({
+ url: "/api/global/auth/init",
+ body: info,
+ })
+ },
+
+ /**
+ * Gets the initialisation info.
+ */
+ getInitInfo: async () => {
+ return await API.get({
+ url: "/api/global/auth/init",
+ })
+ },
+
+ /**
+ * Sends a password reset email.
+ * @param tenantId the ID of the tenant the user is in
+ * @param email the email address of the user
+ */
+ requestForgotPassword: async ({ tenantId, email }) => {
+ return await API.post({
+ url: `/api/global/auth/${tenantId}/reset`,
+ body: {
+ email,
+ },
+ })
+ },
+
+ /**
+ * Resets a user's password.
+ * @param tenantId the ID of the tenant the user is in
+ * @param password the new password to set
+ * @param resetCode the reset code to authenticate the request
+ */
+ resetPassword: async ({ tenantId, password, resetCode }) => {
+ return await API.post({
+ url: `/api/global/auth/${tenantId}/reset/update`,
+ body: {
+ password,
+ resetCode,
+ },
+ })
+ },
+})
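The auth module above follows the same builder pattern as every module in this package: each `build*Endpoints` function receives the core API client and returns an object of named endpoint methods that close over it, which `createAPIClient` then merges onto the client. A minimal sketch of that wiring, with a stub client that records requests instead of fetching (the stub and its recorded shape are assumptions for illustration):

```javascript
// Minimal sketch of the endpoint-builder pattern: a build function takes the
// core client and returns methods closing over it.
const buildAuthEndpoints = API => ({
  logIn: async ({ tenantId, username, password }) =>
    API.post({
      url: `/api/global/auth/${tenantId}/login`,
      body: { username, password },
    }),
  logOut: async () => API.post({ url: "/api/global/auth/logout" }),
})

// Stub client that records outgoing requests rather than performing them.
const calls = []
const stubClient = {
  post: async req => {
    calls.push(req)
    return { status: 200 }
  },
}

// Merging the built endpoints onto the client mirrors what createAPIClient does.
const API = { ...stubClient, ...buildAuthEndpoints(stubClient) }

API.logIn({ tenantId: "default", username: "a@b.com", password: "secret" })
console.log(calls[0].url) // → /api/global/auth/default/login
```

Because each builder only depends on the `get`/`post`/`put`/`delete` surface of the client, the same endpoint definitions can be reused by both the builder and client apps with different header or error-handling configuration.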
diff --git a/packages/frontend-core/src/api/automations.js b/packages/frontend-core/src/api/automations.js
new file mode 100644
index 0000000000..3b60b6dbac
--- /dev/null
+++ b/packages/frontend-core/src/api/automations.js
@@ -0,0 +1,76 @@
+export const buildAutomationEndpoints = API => ({
+ /**
+ * Executes an automation. The automation must have an "App Action" trigger.
+ * @param automationId the ID of the automation to trigger
+ * @param fields the fields to trigger the automation with
+ */
+ triggerAutomation: async ({ automationId, fields }) => {
+ return await API.post({
+ url: `/api/automations/${automationId}/trigger`,
+ body: { fields },
+ })
+ },
+
+ /**
+ * Tests an automation with data.
+ * @param automationId the ID of the automation to test
+ * @param testData the test data to run against the automation
+ */
+ testAutomation: async ({ automationId, testData }) => {
+ return await API.post({
+ url: `/api/automations/${automationId}/test`,
+ body: testData,
+ })
+ },
+
+ /**
+ * Gets a list of all automations.
+ */
+ getAutomations: async () => {
+ return await API.get({
+ url: "/api/automations",
+ })
+ },
+
+ /**
+ * Gets a list of all the definitions for blocks in automations.
+ */
+ getAutomationDefinitions: async () => {
+ return await API.get({
+ url: "/api/automations/definitions/list",
+ })
+ },
+
+ /**
+ * Creates an automation.
+ * @param automation the automation to create
+ */
+ createAutomation: async automation => {
+ return await API.post({
+ url: "/api/automations",
+ body: automation,
+ })
+ },
+
+ /**
+ * Updates an automation.
+ * @param automation the automation to update
+ */
+ updateAutomation: async automation => {
+ return await API.put({
+ url: "/api/automations",
+ body: automation,
+ })
+ },
+
+ /**
+ * Deletes an automation.
+ * @param automationId the ID of the automation to delete
+ * @param automationRev the rev of the automation to delete
+ */
+ deleteAutomation: async ({ automationId, automationRev }) => {
+ return await API.delete({
+ url: `/api/automations/${automationId}/${automationRev}`,
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/configs.js b/packages/frontend-core/src/api/configs.js
new file mode 100644
index 0000000000..9e320f7499
--- /dev/null
+++ b/packages/frontend-core/src/api/configs.js
@@ -0,0 +1,86 @@
+export const buildConfigEndpoints = API => ({
+ /**
+ * Saves a global config.
+ * @param config the config to save
+ */
+ saveConfig: async config => {
+ return await API.post({
+ url: "/api/global/configs",
+ body: config,
+ })
+ },
+
+ /**
+ * Gets a global config of a certain type.
+ * @param type the type to fetch
+ */
+ getConfig: async type => {
+ return await API.get({
+ url: `/api/global/configs/${type}`,
+ })
+ },
+
+ /**
+ * Gets the config for a certain tenant.
+ * @param tenantId the tenant ID to get the config for
+ */
+ getTenantConfig: async tenantId => {
+ return await API.get({
+ url: `/api/global/configs/public?tenantId=${tenantId}`,
+ })
+ },
+
+ /**
+ * Gets the OIDC config for a certain tenant.
+ * @param tenantId the tenant ID to get the config for
+ */
+ getOIDCConfig: async tenantId => {
+ return await API.get({
+ url: `/api/global/configs/public/oidc?tenantId=${tenantId}`,
+ })
+ },
+
+ /**
+ * Gets the checklist for a specific tenant.
+ * @param tenantId the tenant ID to get the checklist for
+ */
+ getChecklist: async tenantId => {
+ return await API.get({
+ url: `/api/global/configs/checklist?tenantId=${tenantId}`,
+ })
+ },
+
+ /**
+ * Updates the company logo for the environment.
+ * @param data the logo form data
+ */
+ uploadLogo: async data => {
+ return await API.post({
+ url: "/api/global/configs/upload/settings/logoUrl",
+ body: data,
+ json: false,
+ })
+ },
+
+ /**
+ * Uploads a logo for an OIDC provider.
+ * @param name the name of the OIDC provider
+ * @param data the logo form data to upload
+ */
+ uploadOIDCLogo: async ({ name, data }) => {
+ return await API.post({
+ url: `/api/global/configs/upload/logos_oidc/${name}`,
+ body: data,
+ json: false,
+ })
+ },
+
+ /**
+ * Gets the list of OIDC logos.
+ */
+ getOIDCLogos: async () => {
+ return await API.get({
+ url: "/api/global/configs/logos_oidc",
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/datasources.js b/packages/frontend-core/src/api/datasources.js
new file mode 100644
index 0000000000..ff72fbf25b
--- /dev/null
+++ b/packages/frontend-core/src/api/datasources.js
@@ -0,0 +1,57 @@
+export const buildDatasourceEndpoints = API => ({
+ /**
+ * Gets a list of datasources.
+ */
+ getDatasources: async () => {
+ return await API.get({
+ url: "/api/datasources",
+ })
+ },
+
+ /**
+ * Prompts the server to build the schema for a datasource.
+ * @param datasourceId the datasource ID to build the schema for
+ */
+ buildDatasourceSchema: async datasourceId => {
+ return await API.post({
+ url: `/api/datasources/${datasourceId}/schema`,
+ })
+ },
+
+ /**
+ * Creates a datasource.
+ * @param datasource the datasource to create
+ * @param fetchSchema whether to fetch the schema or not
+ */
+ createDatasource: async ({ datasource, fetchSchema }) => {
+ return await API.post({
+ url: "/api/datasources",
+ body: {
+ datasource,
+ fetchSchema,
+ },
+ })
+ },
+
+ /**
+ * Updates a datasource.
+ * @param datasource the datasource to update
+ */
+ updateDatasource: async datasource => {
+ return await API.put({
+ url: `/api/datasources/${datasource._id}`,
+ body: datasource,
+ })
+ },
+
+ /**
+ * Deletes a datasource.
+ * @param datasourceId the ID of the datasource to delete
+ * @param datasourceRev the rev of the datasource to delete
+ */
+ deleteDatasource: async ({ datasourceId, datasourceRev }) => {
+ return await API.delete({
+ url: `/api/datasources/${datasourceId}/${datasourceRev}`,
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/flags.js b/packages/frontend-core/src/api/flags.js
new file mode 100644
index 0000000000..bb545e83b9
--- /dev/null
+++ b/packages/frontend-core/src/api/flags.js
@@ -0,0 +1,25 @@
+export const buildFlagEndpoints = API => ({
+ /**
+ * Gets the current user flags object.
+ */
+ getFlags: async () => {
+ return await API.get({
+ url: "/api/users/flags",
+ })
+ },
+
+ /**
+ * Updates a flag for the current user.
+ * @param flag the flag to update
+ * @param value the value to set the flag to
+ */
+ updateFlag: async ({ flag, value }) => {
+ return await API.post({
+ url: "/api/users/flags",
+ body: {
+ flag,
+ value,
+ },
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/hosting.js b/packages/frontend-core/src/api/hosting.js
new file mode 100644
index 0000000000..8c398f9ae7
--- /dev/null
+++ b/packages/frontend-core/src/api/hosting.js
@@ -0,0 +1,19 @@
+export const buildHostingEndpoints = API => ({
+ /**
+ * Gets the hosting URLs of the environment.
+ */
+ getHostingURLs: async () => {
+ return await API.get({
+ url: "/api/hosting/urls",
+ })
+ },
+
+ /**
+ * Gets the list of deployed apps.
+ */
+ getDeployedApps: async () => {
+ return await API.get({
+ url: "/api/hosting/apps",
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/index.js b/packages/frontend-core/src/api/index.js
new file mode 100644
index 0000000000..0d9ab4449f
--- /dev/null
+++ b/packages/frontend-core/src/api/index.js
@@ -0,0 +1,235 @@
+import { ApiVersion } from "../constants"
+import { buildAnalyticsEndpoints } from "./analytics"
+import { buildAppEndpoints } from "./app"
+import { buildAttachmentEndpoints } from "./attachments"
+import { buildAuthEndpoints } from "./auth"
+import { buildAutomationEndpoints } from "./automations"
+import { buildConfigEndpoints } from "./configs"
+import { buildDatasourceEndpoints } from "./datasources"
+import { buildFlagEndpoints } from "./flags"
+import { buildHostingEndpoints } from "./hosting"
+import { buildLayoutEndpoints } from "./layouts"
+import { buildOtherEndpoints } from "./other"
+import { buildPermissionsEndpoints } from "./permissions"
+import { buildQueryEndpoints } from "./queries"
+import { buildRelationshipEndpoints } from "./relationships"
+import { buildRoleEndpoints } from "./roles"
+import { buildRouteEndpoints } from "./routes"
+import { buildRowEndpoints } from "./rows"
+import { buildScreenEndpoints } from "./screens"
+import { buildTableEndpoints } from "./tables"
+import { buildTemplateEndpoints } from "./templates"
+import { buildUserEndpoints } from "./user"
+import { buildViewEndpoints } from "./views"
+
+const defaultAPIClientConfig = {
+ /**
+ * Certain definitions, such as table schemas, can't change at runtime for
+ * client apps. Endpoints that support caching can be cached by enabling this
+ * flag. It's disabled by default to avoid bugs caused by stale data.
+ */
+ enableCaching: false,
+
+ /**
+ * A function can be passed in to attach headers to all outgoing requests.
+ * The function receives the headers object, which should be mutated directly.
+ * No return value is required.
+ */
+ attachHeaders: null,
+
+ /**
+ * A function can be passed in which will be invoked any time an API error
+ * occurs. An error is defined as a status code >= 400. This function is
+ * invoked before the actual JS error is thrown up the stack.
+ */
+ onError: null,
+}
+
+/**
+ * Constructs an API client with the provided configuration.
+ * @param config the API client configuration
+ * @return {object} the API client
+ */
+export const createAPIClient = config => {
+ config = {
+ ...defaultAPIClientConfig,
+ ...config,
+ }
+
+ // Generates an error object from an API response
+ const makeErrorFromResponse = async (response, method) => {
+ // Try to read a message from the error
+ let message = response.statusText
+ let json = null
+ try {
+ json = await response.json()
+ if (json?.message) {
+ message = json.message
+ } else if (json?.error) {
+ message = json.error
+ }
+ } catch (error) {
+ // Do nothing
+ }
+ return {
+ message,
+ json,
+ status: response.status,
+ url: response.url,
+ method,
+ handled: true,
+ }
+ }
+
+ // Generates an error object from a string
+ const makeError = (message, request) => {
+ return {
+ message,
+ json: null,
+ status: 400,
+ url: request?.url,
+ method: request?.method,
+ handled: true,
+ }
+ }
+
+ // Performs an API call to the server.
+ const makeApiCall = async ({
+ method,
+ url,
+ body,
+ json = true,
+ external = false,
+ parseResponse,
+ }) => {
+ // Ensure we don't do JSON processing if sending a GET request
+ json = json && method !== "GET"
+
+ // Build headers
+ let headers = { Accept: "application/json" }
+ if (!external) {
+ headers["x-budibase-api-version"] = ApiVersion
+ }
+ if (json) {
+ headers["Content-Type"] = "application/json"
+ }
+ if (config?.attachHeaders) {
+ config.attachHeaders(headers)
+ }
+
+ // Build request body
+ let requestBody = body
+ if (json) {
+ try {
+ requestBody = JSON.stringify(body)
+ } catch (error) {
+ throw makeError("Invalid JSON body", { url, method })
+ }
+ }
+
+ // Make request
+ let response
+ try {
+ response = await fetch(url, {
+ method,
+ headers,
+ body: requestBody,
+ credentials: "same-origin",
+ })
+ } catch (error) {
+ throw makeError("Failed to send request", { url, method })
+ }
+
+ // Handle response
+ if (response.status >= 200 && response.status < 400) {
+ try {
+ if (parseResponse) {
+ return await parseResponse(response)
+ } else {
+ return await response.json()
+ }
+ } catch (error) {
+ return null
+ }
+ } else {
+ throw await makeErrorFromResponse(response, method)
+ }
+ }
+
+ // Performs an API call to the server and caches the response.
+ // Future invocation for this URL will return the cached result instead of
+ // hitting the server again.
+ let cache = {}
+ const makeCachedApiCall = async params => {
+ const identifier = params.url
+ if (!identifier) {
+ return null
+ }
+ if (!cache[identifier]) {
+ cache[identifier] = makeApiCall(params)
+ cache[identifier] = await cache[identifier]
+ }
+ return await cache[identifier]
+ }
+
+ // Constructs an API call function for a particular HTTP method
+ const requestApiCall = method => async params => {
+ try {
+ let { url, cache = false, external = false } = params
+ if (!external) {
+ url = `/${url}`.replace("//", "/")
+ }
+
+ // Cache the request if possible and desired
+ const cacheRequest = cache && config?.enableCaching
+ const handler = cacheRequest ? makeCachedApiCall : makeApiCall
+
+ const enrichedParams = { ...params, method, url }
+ return await handler(enrichedParams)
+ } catch (error) {
+ if (config?.onError) {
+ config.onError(error)
+ }
+ throw error
+ }
+ }
+
+ // Build the underlying core API methods
+ let API = {
+ post: requestApiCall("POST"),
+ get: requestApiCall("GET"),
+ patch: requestApiCall("PATCH"),
+ delete: requestApiCall("DELETE"),
+ put: requestApiCall("PUT"),
+ error: message => {
+ throw makeError(message)
+ },
+ }
+
+ // Attach all endpoints
+ return {
+ ...API,
+ ...buildAnalyticsEndpoints(API),
+ ...buildAppEndpoints(API),
+ ...buildAttachmentEndpoints(API),
+ ...buildAuthEndpoints(API),
+ ...buildAutomationEndpoints(API),
+ ...buildConfigEndpoints(API),
+ ...buildDatasourceEndpoints(API),
+ ...buildFlagEndpoints(API),
+ ...buildHostingEndpoints(API),
+ ...buildLayoutEndpoints(API),
+ ...buildOtherEndpoints(API),
+ ...buildPermissionsEndpoints(API),
+ ...buildQueryEndpoints(API),
+ ...buildRelationshipEndpoints(API),
+ ...buildRoleEndpoints(API),
+ ...buildRouteEndpoints(API),
+ ...buildRowEndpoints(API),
+ ...buildScreenEndpoints(API),
+ ...buildTableEndpoints(API),
+ ...buildTemplateEndpoints(API),
+ ...buildUserEndpoints(API),
+ ...buildViewEndpoints(API),
+ }
+}
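The cached call above stores the in-flight promise itself, so concurrent requests for the same URL share a single network round trip rather than racing each other. A minimal standalone sketch of that deduplication pattern (using a stubbed `fakeFetch` for illustration, not Budibase's real client):

```javascript
// Sketch of the promise-caching pattern used by makeCachedApiCall.
// "fakeFetch" and "fetchCount" are stand-ins for illustration only.
let fetchCount = 0
const fakeFetch = async url => {
  fetchCount++
  return { url, data: "result" }
}

const cache = {}
const cachedCall = url => {
  if (!cache[url]) {
    // Store the promise immediately, so callers arriving before the first
    // request resolves reuse it instead of starting their own request
    cache[url] = fakeFetch(url)
  }
  return cache[url]
}

// Two concurrent calls for the same URL trigger only one fetch
Promise.all([cachedCall("/api/tables/t1"), cachedCall("/api/tables/t1")]).then(
  ([a, b]) => console.log(fetchCount, a.data === b.data) // prints "1 true"
)
```

Note that, as in the real implementation, a rejected promise also stays in the cache, so a failed cacheable request is not retried for the lifetime of the client.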
diff --git a/packages/frontend-core/src/api/layouts.js b/packages/frontend-core/src/api/layouts.js
new file mode 100644
index 0000000000..51bce1f533
--- /dev/null
+++ b/packages/frontend-core/src/api/layouts.js
@@ -0,0 +1,23 @@
+export const buildLayoutEndpoints = API => ({
+ /**
+ * Saves a layout.
+ * @param layout the layout to save
+ */
+ saveLayout: async layout => {
+ return await API.post({
+ url: "/api/layouts",
+ body: layout,
+ })
+ },
+
+ /**
+ * Deletes a layout.
+ * @param layoutId the ID of the layout to delete
+ * @param layoutRev the rev of the layout to delete
+ */
+ deleteLayout: async ({ layoutId, layoutRev }) => {
+ return await API.delete({
+ url: `/api/layouts/${layoutId}/${layoutRev}`,
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/other.js b/packages/frontend-core/src/api/other.js
new file mode 100644
index 0000000000..b2a5ccf441
--- /dev/null
+++ b/packages/frontend-core/src/api/other.js
@@ -0,0 +1,46 @@
+export const buildOtherEndpoints = API => ({
+ /**
+ * TODO: find out what this is
+ */
+ checkImportComplete: async () => {
+ return await API.get({
+ url: "/api/cloud/import/complete",
+ })
+ },
+
+ /**
+ * Gets the current environment details.
+ */
+ getEnvironment: async () => {
+ return await API.get({
+ url: "/api/system/environment",
+ })
+ },
+
+ /**
+ * Gets the list of available integrations.
+ */
+ getIntegrations: async () => {
+ return await API.get({
+ url: "/api/integrations",
+ })
+ },
+
+ /**
+ * Gets the version of the installed Budibase environment.
+ */
+ getBudibaseVersion: async () => {
+ return await API.get({
+ url: "/api/dev/version",
+ })
+ },
+
+ /**
+ * Gets the base permissions for roles.
+ */
+ getBasePermissions: async () => {
+ return await API.get({
+ url: "/api/permission/builtin",
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/permissions.js b/packages/frontend-core/src/api/permissions.js
new file mode 100644
index 0000000000..5407cb3ce5
--- /dev/null
+++ b/packages/frontend-core/src/api/permissions.js
@@ -0,0 +1,24 @@
+export const buildPermissionsEndpoints = API => ({
+ /**
+ * Gets the permission required to access a specific resource
+ * @param resourceId the resource ID to check
+ */
+ getPermissionForResource: async resourceId => {
+ return await API.get({
+ url: `/api/permission/${resourceId}`,
+ })
+ },
+
+ /**
+ * Updates the permissions for a certain resource
+ * @param resourceId the ID of the resource to update
+ * @param roleId the ID of the role to update the permissions of
+ * @param level the level to assign the role for this resource
+ * @return {Promise<*>}
+ */
+ updatePermissionForResource: async ({ resourceId, roleId, level }) => {
+ return await API.post({
+ url: `/api/permission/${roleId}/${resourceId}/${level}`,
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/queries.js b/packages/frontend-core/src/api/queries.js
new file mode 100644
index 0000000000..f18ec7c4ec
--- /dev/null
+++ b/packages/frontend-core/src/api/queries.js
@@ -0,0 +1,85 @@
+export const buildQueryEndpoints = API => ({
+ /**
+ * Executes a query against an external data connector.
+ * @param queryId the ID of the query to execute
+ * @param pagination pagination info for the query
+ * @param parameters parameters for the query
+ */
+ executeQuery: async ({ queryId, pagination, parameters }) => {
+ return await API.post({
+ url: `/api/v2/queries/${queryId}`,
+ body: {
+ parameters,
+ pagination,
+ },
+ })
+ },
+
+ /**
+ * Fetches the definition of an external query.
+   * @param queryId the ID of the query to fetch the definition of
+ */
+ fetchQueryDefinition: async queryId => {
+ return await API.get({
+ url: `/api/queries/${queryId}`,
+ cache: true,
+ })
+ },
+
+ /**
+ * Gets a list of queries
+ */
+ getQueries: async () => {
+ return await API.get({
+ url: "/api/queries",
+ })
+ },
+
+ /**
+ * Saves a query.
+ * @param query the query to save
+ */
+ saveQuery: async query => {
+ return await API.post({
+ url: "/api/queries",
+ body: query,
+ })
+ },
+
+ /**
+ * Deletes a query
+ * @param queryId the ID of the query to delete
+ * @param queryRev the rev of the query to delete
+ */
+ deleteQuery: async ({ queryId, queryRev }) => {
+ return await API.delete({
+ url: `/api/queries/${queryId}/${queryRev}`,
+ })
+ },
+
+ /**
+ * Imports a set of queries into a certain datasource
+ * @param datasourceId the datasource ID to import queries into
+ * @param data the data string of the content to import
+ */
+ importQueries: async ({ datasourceId, data }) => {
+ return await API.post({
+ url: "/api/queries/import",
+ body: {
+ datasourceId,
+ data,
+ },
+ })
+ },
+
+ /**
+ * Runs a query with test parameters to see the result.
+ * @param query the query to run
+ */
+ previewQuery: async query => {
+ return await API.post({
+ url: "/api/queries/preview",
+ body: query,
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/relationships.js b/packages/frontend-core/src/api/relationships.js
new file mode 100644
index 0000000000..fbc727f8e1
--- /dev/null
+++ b/packages/frontend-core/src/api/relationships.js
@@ -0,0 +1,19 @@
+export const buildRelationshipEndpoints = API => ({
+ /**
+ * Fetches related rows for a certain field of a certain row.
+ * @param tableId the ID of the table to fetch from
+ * @param rowId the ID of the row to fetch related rows for
+ * @param fieldName the name of the relationship field
+ */
+ fetchRelationshipData: async ({ tableId, rowId, fieldName }) => {
+ if (!tableId || !rowId) {
+ return []
+ }
+ const response = await API.get({ url: `/api/${tableId}/${rowId}/enrich` })
+ if (!fieldName) {
+ return response || []
+ } else {
+ return response[fieldName] || []
+ }
+ },
+})
diff --git a/packages/frontend-core/src/api/roles.js b/packages/frontend-core/src/api/roles.js
new file mode 100644
index 0000000000..15c27091c4
--- /dev/null
+++ b/packages/frontend-core/src/api/roles.js
@@ -0,0 +1,32 @@
+export const buildRoleEndpoints = API => ({
+ /**
+ * Deletes a role.
+ * @param roleId the ID of the role to delete
+ * @param roleRev the rev of the role to delete
+ */
+ deleteRole: async ({ roleId, roleRev }) => {
+ return await API.delete({
+ url: `/api/roles/${roleId}/${roleRev}`,
+ })
+ },
+
+ /**
+ * Saves a role.
+ * @param role the role to save
+ */
+ saveRole: async role => {
+ return await API.post({
+ url: "/api/roles",
+ body: role,
+ })
+ },
+
+ /**
+ * Gets a list of roles.
+ */
+ getRoles: async () => {
+ return await API.get({
+ url: "/api/roles",
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/routes.js b/packages/frontend-core/src/api/routes.js
new file mode 100644
index 0000000000..28e206debc
--- /dev/null
+++ b/packages/frontend-core/src/api/routes.js
@@ -0,0 +1,19 @@
+export const buildRouteEndpoints = API => ({
+ /**
+ * Fetches available routes for the client app.
+ */
+ fetchClientAppRoutes: async () => {
+ return await API.get({
+ url: `/api/routing/client`,
+ })
+ },
+
+ /**
+ * Fetches all routes for the current app.
+ */
+ fetchAppRoutes: async () => {
+ return await API.get({
+ url: "/api/routing",
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/rows.js b/packages/frontend-core/src/api/rows.js
new file mode 100644
index 0000000000..553cf8e0de
--- /dev/null
+++ b/packages/frontend-core/src/api/rows.js
@@ -0,0 +1,63 @@
+export const buildRowEndpoints = API => ({
+ /**
+ * Fetches data about a certain row in a table.
+ * @param tableId the ID of the table to fetch from
+ * @param rowId the ID of the row to fetch
+ */
+ fetchRow: async ({ tableId, rowId }) => {
+ if (!tableId || !rowId) {
+ return null
+ }
+ const row = await API.get({
+ url: `/api/${tableId}/rows/${rowId}`,
+ })
+ return (await API.enrichRows([row], tableId))[0]
+ },
+
+ /**
+ * Creates or updates a row in a table.
+ * @param row the row to save
+ */
+ saveRow: async row => {
+ if (!row?.tableId) {
+ return
+ }
+ return await API.post({
+ url: `/api/${row.tableId}/rows`,
+ body: row,
+ })
+ },
+
+ /**
+ * Deletes a row from a table.
+ * @param tableId the ID of the table to delete from
+ * @param rowId the ID of the row to delete
+ * @param revId the rev of the row to delete
+ */
+ deleteRow: async ({ tableId, rowId, revId }) => {
+ if (!tableId || !rowId || !revId) {
+ return
+ }
+ return await API.delete({
+ url: `/api/${tableId}/rows`,
+ body: {
+ _id: rowId,
+ _rev: revId,
+ },
+ })
+ },
+
+ /**
+ * Deletes multiple rows from a table.
+ * @param tableId the table ID to delete the rows from
+ * @param rows the array of rows to delete
+ */
+ deleteRows: async ({ tableId, rows }) => {
+ return await API.delete({
+ url: `/api/${tableId}/rows`,
+ body: {
+ rows,
+ },
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/screens.js b/packages/frontend-core/src/api/screens.js
new file mode 100644
index 0000000000..1daa79153b
--- /dev/null
+++ b/packages/frontend-core/src/api/screens.js
@@ -0,0 +1,23 @@
+export const buildScreenEndpoints = API => ({
+ /**
+ * Saves a screen definition
+ * @param screen the screen to save
+ */
+ saveScreen: async screen => {
+ return await API.post({
+ url: "/api/screens",
+ body: screen,
+ })
+ },
+
+ /**
+ * Deletes a screen.
+ * @param screenId the ID of the screen to delete
+ * @param screenRev the rev of the screen to delete
+ */
+ deleteScreen: async ({ screenId, screenRev }) => {
+ return await API.delete({
+ url: `/api/screens/${screenId}/${screenRev}`,
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/tables.js b/packages/frontend-core/src/api/tables.js
new file mode 100644
index 0000000000..279d3ba6fb
--- /dev/null
+++ b/packages/frontend-core/src/api/tables.js
@@ -0,0 +1,123 @@
+export const buildTableEndpoints = API => ({
+ /**
+ * Fetches a table definition.
+ * Since definitions cannot change at runtime, the result is cached.
+ * @param tableId the ID of the table to fetch
+ */
+ fetchTableDefinition: async tableId => {
+ return await API.get({
+ url: `/api/tables/${tableId}`,
+ cache: true,
+ })
+ },
+
+ /**
+ * Fetches all rows from a table.
+   * @param tableId the ID of the table to fetch from
+ */
+ fetchTableData: async tableId => {
+ return await API.get({ url: `/api/${tableId}/rows` })
+ },
+
+ /**
+ * Searches a table using Lucene.
+ * @param tableId the ID of the table to search
+ * @param query the lucene search query
+ * @param bookmark the current pagination bookmark
+ * @param limit the number of rows to retrieve
+ * @param sort the field to sort by
+ * @param sortOrder the order to sort by
+ * @param sortType the type to sort by, either numerically or alphabetically
+ * @param paginate whether to paginate the data
+ */
+ searchTable: async ({
+ tableId,
+ query,
+ bookmark,
+ limit,
+ sort,
+ sortOrder,
+ sortType,
+ paginate,
+ }) => {
+ if (!tableId || !query) {
+ return {
+ rows: [],
+ }
+ }
+ return await API.post({
+ url: `/api/${tableId}/search`,
+ body: {
+ query,
+ bookmark,
+ limit,
+ sort,
+ sortOrder,
+ sortType,
+ paginate,
+ },
+ })
+ },
+
+ /**
+ * Imports data into an existing table
+ * @param tableId the table ID to import to
+ * @param data the data import object
+ */
+ importTableData: async ({ tableId, data }) => {
+ return await API.post({
+ url: `/api/tables/${tableId}/import`,
+ body: {
+ dataImport: data,
+ },
+ })
+ },
+
+ /**
+ * Validates a candidate CSV to be imported for a certain table.
+ * @param tableId the table ID to import to
+ * @param csvString the CSV contents as a string
+ * @param schema the proposed schema
+ */
+ validateTableCSV: async ({ tableId, csvString, schema }) => {
+ return await API.post({
+ url: "/api/tables/csv/validate",
+ body: {
+ csvString,
+ schema,
+ tableId,
+ },
+ })
+ },
+
+ /**
+   * Gets a list of tables.
+ */
+ getTables: async () => {
+ return await API.get({
+ url: "/api/tables",
+ })
+ },
+
+ /**
+ * Saves a table.
+ * @param table the table to save
+ */
+ saveTable: async table => {
+ return await API.post({
+ url: "/api/tables",
+ body: table,
+ })
+ },
+
+ /**
+ * Deletes a table.
+ * @param tableId the ID of the table to delete
+ * @param tableRev the rev of the table to delete
+ */
+ deleteTable: async ({ tableId, tableRev }) => {
+ return await API.delete({
+ url: `/api/tables/${tableId}/${tableRev}`,
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/templates.js b/packages/frontend-core/src/api/templates.js
new file mode 100644
index 0000000000..3c474dabc6
--- /dev/null
+++ b/packages/frontend-core/src/api/templates.js
@@ -0,0 +1,35 @@
+export const buildTemplateEndpoints = API => ({
+ /**
+ * Gets the list of email template definitions.
+ */
+ getEmailTemplateDefinitions: async () => {
+ return await API.get({ url: "/api/global/template/definitions" })
+ },
+
+ /**
+ * Gets the list of email templates.
+ */
+ getEmailTemplates: async () => {
+ return await API.get({ url: "/api/global/template/email" })
+ },
+
+ /**
+ * Saves an email template.
+ * @param template the template to save
+ */
+ saveEmailTemplate: async template => {
+ return await API.post({
+ url: "/api/global/template",
+ body: template,
+ })
+ },
+
+ /**
+ * Gets a list of app templates.
+ */
+ getAppTemplates: async () => {
+ return await API.get({
+ url: "/api/templates?type=app",
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/user.js b/packages/frontend-core/src/api/user.js
new file mode 100644
index 0000000000..d1fe7f7251
--- /dev/null
+++ b/packages/frontend-core/src/api/user.js
@@ -0,0 +1,129 @@
+export const buildUserEndpoints = API => ({
+ /**
+ * Fetches the currently logged-in user object.
+ * Used in client apps.
+ */
+ fetchSelf: async () => {
+ return await API.get({
+ url: "/api/self",
+ })
+ },
+
+ /**
+ * Fetches the currently logged-in user object.
+ * Used in the builder.
+ */
+ fetchBuilderSelf: async () => {
+ return await API.get({
+ url: "/api/global/users/self",
+ })
+ },
+
+ /**
+ * Gets a list of users in the current tenant.
+ */
+ getUsers: async () => {
+ return await API.get({
+ url: "/api/global/users",
+ })
+ },
+
+ /**
+ * Creates a user for an app.
+ * @param user the user to create
+ */
+ createAppUser: async user => {
+ return await API.post({
+ url: "/api/users/metadata",
+ body: user,
+ })
+ },
+
+ /**
+ * Updates the current user metadata.
+ * @param metadata the metadata to save
+ */
+ updateOwnMetadata: async metadata => {
+ return await API.post({
+ url: "/api/users/metadata/self",
+ body: metadata,
+ })
+ },
+
+ /**
+ * Creates an admin user.
+ * @param adminUser the admin user to create
+ */
+ createAdminUser: async adminUser => {
+ return await API.post({
+ url: "/api/global/users/init",
+ body: adminUser,
+ })
+ },
+
+ /**
+ * Updates the current logged-in user.
+ * @param user the new user object to save
+ */
+ updateSelf: async user => {
+ return await API.post({
+ url: "/api/global/users/self",
+ body: user,
+ })
+ },
+
+ /**
+ * Creates or updates a user in the current tenant.
+   * @param user the user to save
+ */
+ saveUser: async user => {
+ return await API.post({
+ url: "/api/global/users",
+ body: user,
+ })
+ },
+
+ /**
+   * Deletes a user from the current tenant.
+ * @param userId the ID of the user to delete
+ */
+ deleteUser: async userId => {
+ return await API.delete({
+ url: `/api/global/users/${userId}`,
+ })
+ },
+
+ /**
+ * Invites a user to the current tenant.
+ * @param email the email address to send the invitation to
+ * @param builder whether the user should be a global builder
+ * @param admin whether the user should be a global admin
+ */
+ inviteUser: async ({ email, builder, admin }) => {
+ return await API.post({
+ url: "/api/global/users/invite",
+ body: {
+ email,
+ userInfo: {
+ admin: admin ? { global: true } : undefined,
+ builder: builder ? { global: true } : undefined,
+ },
+ },
+ })
+ },
+
+ /**
+ * Accepts an invitation to join the platform and creates a user.
+ * @param inviteCode the invite code sent in the email
+ * @param password the password for the newly created user
+ */
+ acceptInvitation: async ({ inviteCode, password }) => {
+ return await API.post({
+ url: "/api/global/users/invite/accept",
+ body: {
+ inviteCode,
+ password,
+ },
+ })
+ },
+})
diff --git a/packages/frontend-core/src/api/views.js b/packages/frontend-core/src/api/views.js
new file mode 100644
index 0000000000..237a66bc13
--- /dev/null
+++ b/packages/frontend-core/src/api/views.js
@@ -0,0 +1,59 @@
+export const buildViewEndpoints = API => ({
+ /**
+ * Fetches all rows in a view
+ * @param name the name of the view
+ * @param field the field to perform the calculation on
+ * @param groupBy the field to group by
+ * @param calculation the calculation to perform
+ */
+ fetchViewData: async ({ name, field, groupBy, calculation }) => {
+ const params = new URLSearchParams()
+ if (calculation) {
+ params.set("field", field)
+ params.set("calculation", calculation)
+ }
+ if (groupBy) {
+ params.set("group", groupBy)
+ }
+ const QUERY_VIEW_URL = field
+ ? `/api/views/${name}?${params}`
+ : `/api/views/${name}`
+ return await API.get({ url: QUERY_VIEW_URL })
+ },
+
+ /**
+ * Exports a view for download
+ * @param viewName the view to export
+ * @param format the format to download
+ */
+ exportView: async ({ viewName, format }) => {
+ const safeViewName = encodeURIComponent(viewName)
+ return await API.get({
+ url: `/api/views/export?view=${safeViewName}&format=${format}`,
+ parseResponse: async response => {
+ return await response.text()
+ },
+ })
+ },
+
+ /**
+ * Saves a view.
+ * @param view the view to save
+ */
+ saveView: async view => {
+ return await API.post({
+ url: "/api/views",
+ body: view,
+ })
+ },
+
+ /**
+ * Deletes a view.
+ * @param viewName the name of the view to delete
+ */
+ deleteView: async viewName => {
+ return await API.delete({
+ url: `/api/views/${viewName}`,
+ })
+ },
+})
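The view URL construction in `fetchViewData` only attaches query parameters when a calculation field is present. A standalone sketch of that logic (the helper name is hypothetical):

```javascript
// Standalone sketch of the URL construction in fetchViewData.
// "buildViewUrl" is a hypothetical name for illustration only.
const buildViewUrl = ({ name, field, groupBy, calculation }) => {
  const params = new URLSearchParams()
  if (calculation) {
    params.set("field", field)
    params.set("calculation", calculation)
  }
  if (groupBy) {
    params.set("group", groupBy)
  }
  // Query params are only attached when a calculation field is present
  return field ? `/api/views/${name}?${params}` : `/api/views/${name}`
}

console.log(buildViewUrl({ name: "Totals", field: "amount", calculation: "sum" }))
// → /api/views/Totals?field=amount&calculation=sum
```

Note that `fetchViewData`, unlike `exportView`, does not URI-encode the view name, so names containing special characters would need to be encoded by the caller.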
diff --git a/packages/frontend-core/src/constants.js b/packages/frontend-core/src/constants.js
new file mode 100644
index 0000000000..33cf1b7d1c
--- /dev/null
+++ b/packages/frontend-core/src/constants.js
@@ -0,0 +1,65 @@
+/**
+ * Operator options for lucene queries
+ */
+export const OperatorOptions = {
+ Equals: {
+ value: "equal",
+ label: "Equals",
+ },
+ NotEquals: {
+ value: "notEqual",
+ label: "Not equals",
+ },
+ Empty: {
+ value: "empty",
+ label: "Is empty",
+ },
+ NotEmpty: {
+ value: "notEmpty",
+ label: "Is not empty",
+ },
+ StartsWith: {
+ value: "string",
+ label: "Starts with",
+ },
+ Like: {
+ value: "fuzzy",
+ label: "Like",
+ },
+ MoreThan: {
+ value: "rangeLow",
+ label: "More than",
+ },
+ LessThan: {
+ value: "rangeHigh",
+ label: "Less than",
+ },
+ Contains: {
+ value: "equal",
+ label: "Contains",
+ },
+ NotContains: {
+ value: "notEqual",
+    label: "Does not contain",
+ },
+}
+
+// Cookie names
+export const Cookies = {
+ Auth: "budibase:auth",
+ CurrentApp: "budibase:currentapp",
+ ReturnUrl: "budibase:returnurl",
+}
+
+// Table names
+export const TableNames = {
+ USERS: "ta_users",
+}
+
+/**
+ * API version header attached to all requests.
+ * Version changelog:
+ * v1:
+ * - Coerce types for search endpoint
+ */
+export const ApiVersion = "1"
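The operator map above is keyed by name, with `value` holding the Lucene operator and `label` the UI text. Because `Contains` and `Equals` share the `"equal"` value (as do `NotContains` and `NotEquals`), a reverse lookup by value is ambiguous; any lookup should go by key or label instead. A small sketch using a subset of the map (the helper name is hypothetical):

```javascript
// Minimal subset of OperatorOptions for illustration
const OperatorOptions = {
  Equals: { value: "equal", label: "Equals" },
  Contains: { value: "equal", label: "Contains" },
  StartsWith: { value: "string", label: "Starts with" },
}

// Hypothetical helper: find the Lucene operator value for a UI label
const valueForLabel = label =>
  Object.values(OperatorOptions).find(op => op.label === label)?.value

console.log(valueForLabel("Starts with")) // → string
console.log(valueForLabel("Contains")) // → equal
```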
diff --git a/packages/client/src/utils/fetch/DataFetch.js b/packages/frontend-core/src/fetch/DataFetch.js
similarity index 91%
rename from packages/client/src/utils/fetch/DataFetch.js
rename to packages/frontend-core/src/fetch/DataFetch.js
index 7fe53bb8af..3455ee132b 100644
--- a/packages/client/src/utils/fetch/DataFetch.js
+++ b/packages/frontend-core/src/fetch/DataFetch.js
@@ -1,11 +1,11 @@
import { writable, derived, get } from "svelte/store"
+import { cloneDeep } from "lodash/fp"
import {
buildLuceneQuery,
luceneLimit,
- luceneQuery,
+ runLuceneQuery,
luceneSort,
-} from "builder/src/helpers/lucene"
-import { fetchTableDefinition } from "api"
+} from "../utils/lucene"
/**
* Parent class which handles the implementation of fetching data from an
@@ -13,6 +13,9 @@ import { fetchTableDefinition } from "api"
* For other types of datasource, this class is overridden and extended.
*/
export default class DataFetch {
+ // API client
+ API = null
+
// Feature flags
featureStore = writable({
supportsSearch: false,
@@ -57,10 +60,14 @@ export default class DataFetch {
*/
constructor(opts) {
// Merge options with their default values
+ this.API = opts?.API
this.options = {
...this.options,
...opts,
}
+ if (!this.API) {
+ throw "An API client is required for fetching data"
+ }
// Bind all functions to properly scope "this"
this.getData = this.getData.bind(this)
@@ -112,7 +119,7 @@ export default class DataFetch {
const { datasource, filter, sortColumn, paginate } = this.options
// Fetch datasource definition and determine feature flags
- const definition = await this.constructor.getDefinition(datasource)
+ const definition = await this.getDefinition(datasource)
const features = this.determineFeatureFlags(definition)
this.featureStore.set({
supportsSearch: !!features?.supportsSearch,
@@ -121,8 +128,8 @@ export default class DataFetch {
})
// Fetch and enrich schema
- let schema = this.constructor.getSchema(datasource, definition)
- schema = DataFetch.enrichSchema(schema)
+ let schema = this.getSchema(datasource, definition)
+ schema = this.enrichSchema(schema)
if (!schema) {
return
}
@@ -178,7 +185,7 @@ export default class DataFetch {
// If we don't support searching, do a client search
if (!features.supportsSearch) {
- rows = luceneQuery(rows, query)
+ rows = runLuceneQuery(rows, query)
}
// If we don't support sorting, do a client-side sort
@@ -218,11 +225,15 @@ export default class DataFetch {
* @param datasource
* @return {object} the definition
*/
- static async getDefinition(datasource) {
+ async getDefinition(datasource) {
if (!datasource?.tableId) {
return null
}
- return await fetchTableDefinition(datasource.tableId)
+ try {
+ return await this.API.fetchTableDefinition(datasource.tableId)
+ } catch (error) {
+ return null
+ }
}
/**
@@ -232,7 +243,7 @@ export default class DataFetch {
* @param definition the datasource definition
* @return {object} the schema
*/
- static getSchema(datasource, definition) {
+ getSchema(datasource, definition) {
return definition?.schema
}
@@ -241,7 +252,7 @@ export default class DataFetch {
* @param schema the datasource schema
* @return {object} the enriched datasource schema
*/
- static enrichSchema(schema) {
+ enrichSchema(schema) {
if (schema == null) {
return null
}
@@ -293,10 +304,12 @@ export default class DataFetch {
return
}
- // Assign new options and reload data
+ // Assign new options and reload data.
+ // Clone the new options to ensure that some external source doesn't end up
+ // mutating the real values in the config.
this.options = {
...this.options,
- ...newOptions,
+ ...cloneDeep(newOptions),
}
await this.getInitialData()
}
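Deep-cloning incoming options on update stops a caller from later mutating the fetch's live config through a retained reference. A quick standalone illustration of the difference (plain objects rather than the Budibase class; a JSON round trip stands in for lodash's `cloneDeep`):

```javascript
// Without cloning, a shallow merge aliases the caller's nested objects
const external = { filter: { column: "name" } }
const optionsNoClone = { ...external }
external.filter.column = "mutated"
console.log(optionsNoClone.filter.column) // → mutated

// With a deep clone (JSON round trip here, standing in for cloneDeep),
// later external mutation can't reach the stored options
const external2 = { filter: { column: "name" } }
const optionsCloned = JSON.parse(JSON.stringify(external2))
external2.filter.column = "mutated"
console.log(optionsCloned.filter.column) // → name
```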
diff --git a/packages/client/src/utils/fetch/FieldFetch.js b/packages/frontend-core/src/fetch/FieldFetch.js
similarity index 93%
rename from packages/client/src/utils/fetch/FieldFetch.js
rename to packages/frontend-core/src/fetch/FieldFetch.js
index ef9902ed27..9402a45a83 100644
--- a/packages/client/src/utils/fetch/FieldFetch.js
+++ b/packages/frontend-core/src/fetch/FieldFetch.js
@@ -1,7 +1,7 @@
import DataFetch from "./DataFetch.js"
export default class FieldFetch extends DataFetch {
- static async getDefinition(datasource) {
+ async getDefinition(datasource) {
// Field sources have their schema statically defined
let schema
if (datasource.fieldType === "attachment") {
@@ -28,7 +28,7 @@ export default class FieldFetch extends DataFetch {
// These sources will be available directly from context
const data = datasource?.value || []
- let rows = []
+ let rows
if (Array.isArray(data) && data[0] && typeof data[0] !== "object") {
rows = data.map(value => ({ value }))
} else {
diff --git a/packages/frontend-core/src/fetch/JSONArrayFetch.js b/packages/frontend-core/src/fetch/JSONArrayFetch.js
new file mode 100644
index 0000000000..ab2af3e2c7
--- /dev/null
+++ b/packages/frontend-core/src/fetch/JSONArrayFetch.js
@@ -0,0 +1,16 @@
+import FieldFetch from "./FieldFetch.js"
+import { getJSONArrayDatasourceSchema } from "../utils/json"
+
+export default class JSONArrayFetch extends FieldFetch {
+ async getDefinition(datasource) {
+ // JSON arrays need their table definitions fetched.
+ // We can then extract their schema as a subset of the table schema.
+ try {
+ const table = await this.API.fetchTableDefinition(datasource.tableId)
+ const schema = getJSONArrayDatasourceSchema(table?.schema, datasource)
+ return { schema }
+ } catch (error) {
+ return null
+ }
+ }
+}
diff --git a/packages/client/src/utils/fetch/NestedProviderFetch.js b/packages/frontend-core/src/fetch/NestedProviderFetch.js
similarity index 91%
rename from packages/client/src/utils/fetch/NestedProviderFetch.js
rename to packages/frontend-core/src/fetch/NestedProviderFetch.js
index 163d7e9930..01c22b6ba0 100644
--- a/packages/client/src/utils/fetch/NestedProviderFetch.js
+++ b/packages/frontend-core/src/fetch/NestedProviderFetch.js
@@ -1,7 +1,7 @@
import DataFetch from "./DataFetch.js"
export default class NestedProviderFetch extends DataFetch {
- static async getDefinition(datasource) {
+ async getDefinition(datasource) {
// Nested providers should already have exposed their own schema
return {
schema: datasource?.value?.schema,
diff --git a/packages/frontend-core/src/fetch/QueryFetch.js b/packages/frontend-core/src/fetch/QueryFetch.js
new file mode 100644
index 0000000000..1f8e900fb5
--- /dev/null
+++ b/packages/frontend-core/src/fetch/QueryFetch.js
@@ -0,0 +1,86 @@
+import DataFetch from "./DataFetch.js"
+import { cloneDeep } from "lodash/fp"
+import { get } from "svelte/store"
+
+export default class QueryFetch extends DataFetch {
+ determineFeatureFlags(definition) {
+ const supportsPagination =
+ !!definition?.fields?.pagination?.type &&
+ !!definition?.fields?.pagination?.location &&
+ !!definition?.fields?.pagination?.pageParam
+ return { supportsPagination }
+ }
+
+ async getDefinition(datasource) {
+ if (!datasource?._id) {
+ return null
+ }
+ try {
+ const definition = await this.API.fetchQueryDefinition(datasource._id)
+      // The server strips the "fields" attribute from query definitions for
+      // security reasons, but pagination relies on it, so restore it from the
+      // datasource if it's missing.
+ if (!definition.fields) {
+ definition.fields = datasource.fields
+ }
+ return definition
+ } catch (error) {
+ return null
+ }
+ }
+
+ async getData() {
+ const { datasource, limit, paginate } = this.options
+ const { supportsPagination } = get(this.featureStore)
+ const { cursor, definition } = get(this.store)
+ const type = definition?.fields?.pagination?.type
+
+ // Set the default query params
+ let parameters = cloneDeep(datasource?.queryParams || {})
+    for (let param of datasource?.parameters || []) {
+ if (!parameters[param.name]) {
+ parameters[param.name] = param.default
+ }
+ }
+
+ // Add pagination to query if supported
+ let queryPayload = { queryId: datasource?._id, parameters }
+ if (paginate && supportsPagination) {
+ const requestCursor = type === "page" ? parseInt(cursor || 1) : cursor
+ queryPayload.pagination = { page: requestCursor, limit }
+ }
+
+ // Execute query
+ try {
+ const res = await this.API.executeQuery(queryPayload)
+ const { data, pagination, ...rest } = res
+
+ // Derive pagination info from response
+ let nextCursor = null
+ let hasNextPage = false
+ if (paginate && supportsPagination) {
+ if (type === "page") {
+ // For "page number" pagination, increment the existing page number
+ nextCursor = queryPayload.pagination.page + 1
+ hasNextPage = data?.length === limit && limit > 0
+ } else {
+ // For "cursor" pagination, the cursor should be in the response
+ nextCursor = pagination?.cursor
+ hasNextPage = nextCursor != null
+ }
+ }
+
+ return {
+ rows: data || [],
+ info: rest,
+ cursor: nextCursor,
+ hasNextPage,
+ }
+ } catch (error) {
+ return {
+ rows: [],
+ hasNextPage: false,
+ }
+ }
+ }
+}
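
As an aside, the pagination bookkeeping in the new `QueryFetch.getData` above can be sketched in isolation. This is a hedged, self-contained sketch — the `deriveNextPage` helper name is ours, not Budibase's — of how "page number" pagination differs from "cursor" pagination when deriving the next cursor:

```javascript
// Hypothetical helper mirroring the pagination logic in QueryFetch.getData.
// "page" pagination increments the page number and infers a next page from a
// full result set; "cursor" pagination relies on the server-provided cursor.
function deriveNextPage({ type, page, limit, data, pagination }) {
  if (type === "page") {
    return {
      cursor: page + 1,
      hasNextPage: data?.length === limit && limit > 0,
    }
  }
  const cursor = pagination?.cursor
  return { cursor: cursor ?? null, hasNextPage: cursor != null }
}
```

With page-number pagination there is no way to know a next page exists without fetching it, so a full page (`data.length === limit`) is used as the heuristic; cursor pagination is unambiguous because the server either returns a cursor or it does not.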
diff --git a/packages/frontend-core/src/fetch/RelationshipFetch.js b/packages/frontend-core/src/fetch/RelationshipFetch.js
new file mode 100644
index 0000000000..04797fcdf1
--- /dev/null
+++ b/packages/frontend-core/src/fetch/RelationshipFetch.js
@@ -0,0 +1,17 @@
+import DataFetch from "./DataFetch.js"
+
+export default class RelationshipFetch extends DataFetch {
+ async getData() {
+ const { datasource } = this.options
+ try {
+ const res = await this.API.fetchRelationshipData({
+ rowId: datasource?.rowId,
+ tableId: datasource?.rowTableId,
+ fieldName: datasource?.fieldName,
+ })
+ return { rows: res || [] }
+ } catch (error) {
+ return { rows: [] }
+ }
+ }
+}
diff --git a/packages/client/src/utils/fetch/TableFetch.js b/packages/frontend-core/src/fetch/TableFetch.js
similarity index 50%
rename from packages/client/src/utils/fetch/TableFetch.js
rename to packages/frontend-core/src/fetch/TableFetch.js
index 16e4dcd7e1..cf0e124020 100644
--- a/packages/client/src/utils/fetch/TableFetch.js
+++ b/packages/frontend-core/src/fetch/TableFetch.js
@@ -1,6 +1,5 @@
import { get } from "svelte/store"
import DataFetch from "./DataFetch.js"
-import { searchTable } from "api"
export default class TableFetch extends DataFetch {
determineFeatureFlags() {
@@ -18,20 +17,27 @@ export default class TableFetch extends DataFetch {
const { cursor, query } = get(this.store)
// Search table
- const res = await searchTable({
- tableId,
- query,
- limit,
- sort: sortColumn,
- sortOrder: sortOrder?.toLowerCase() ?? "ascending",
- sortType,
- paginate,
- bookmark: cursor,
- })
- return {
- rows: res?.rows || [],
- hasNextPage: res?.hasNextPage || false,
- cursor: res?.bookmark || null,
+ try {
+ const res = await this.API.searchTable({
+ tableId,
+ query,
+ limit,
+ sort: sortColumn,
+ sortOrder: sortOrder?.toLowerCase() ?? "ascending",
+ sortType,
+ paginate,
+ bookmark: cursor,
+ })
+ return {
+ rows: res?.rows || [],
+ hasNextPage: res?.hasNextPage || false,
+ cursor: res?.bookmark || null,
+ }
+ } catch (error) {
+ return {
+ rows: [],
+ hasNextPage: false,
+ }
}
}
}
diff --git a/packages/client/src/utils/fetch/ViewFetch.js b/packages/frontend-core/src/fetch/ViewFetch.js
similarity index 54%
rename from packages/client/src/utils/fetch/ViewFetch.js
rename to packages/frontend-core/src/fetch/ViewFetch.js
index 523d8901e0..981969f46c 100644
--- a/packages/client/src/utils/fetch/ViewFetch.js
+++ b/packages/frontend-core/src/fetch/ViewFetch.js
@@ -1,16 +1,17 @@
import DataFetch from "./DataFetch.js"
-import { fetchViewData } from "api"
export default class ViewFetch extends DataFetch {
- static getSchema(datasource, definition) {
+ getSchema(datasource, definition) {
return definition?.views?.[datasource.name]?.schema
}
async getData() {
const { datasource } = this.options
- const res = await fetchViewData(datasource)
- return {
- rows: res || [],
+ try {
+ const res = await this.API.fetchViewData(datasource)
+ return { rows: res || [] }
+ } catch (error) {
+ return { rows: [] }
}
}
}
diff --git a/packages/client/src/utils/fetch/fetchData.js b/packages/frontend-core/src/fetch/fetchData.js
similarity index 79%
rename from packages/client/src/utils/fetch/fetchData.js
rename to packages/frontend-core/src/fetch/fetchData.js
index f93e037b44..e914ff863f 100644
--- a/packages/client/src/utils/fetch/fetchData.js
+++ b/packages/frontend-core/src/fetch/fetchData.js
@@ -11,12 +11,14 @@ const DataFetchMap = {
view: ViewFetch,
query: QueryFetch,
link: RelationshipFetch,
+
+ // Client specific datasource types
provider: NestedProviderFetch,
field: FieldFetch,
jsonarray: JSONArrayFetch,
}
-export const fetchData = (datasource, options) => {
+export const fetchData = ({ API, datasource, options }) => {
const Fetch = DataFetchMap[datasource?.type] || TableFetch
- return new Fetch({ datasource, ...options })
+ return new Fetch({ API, datasource, ...options })
}
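
The `fetchData` change above illustrates the factory pattern this refactor settles on: the API client is passed in explicitly instead of each fetch class importing a global `api` module. A minimal runnable sketch, with stub classes standing in for the real `DataFetch` subclasses:

```javascript
// Minimal sketch of the fetchData factory pattern, using stub classes in
// place of the real DataFetch subclasses.
class TableFetch {
  constructor(opts) {
    this.opts = opts
  }
}
class ViewFetch extends TableFetch {}

const DataFetchMap = { table: TableFetch, view: ViewFetch }

// Unknown or missing datasource types fall back to TableFetch, and the API
// client is threaded through so fetch implementations stay package-agnostic.
const fetchData = ({ API, datasource, options }) => {
  const Fetch = DataFetchMap[datasource?.type] || TableFetch
  return new Fetch({ API, datasource, ...options })
}
```

Injecting `API` is what lets this code move from `packages/client` into the shared `packages/frontend-core`: the builder and client can each supply their own client created by `createAPIClient`.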
diff --git a/packages/frontend-core/src/index.js b/packages/frontend-core/src/index.js
new file mode 100644
index 0000000000..53822fdfb4
--- /dev/null
+++ b/packages/frontend-core/src/index.js
@@ -0,0 +1,7 @@
+export { createAPIClient } from "./api"
+export { createLocalStorageStore } from "./stores/localStorage"
+export { fetchData } from "./fetch/fetchData"
+export * as Constants from "./constants"
+export * as LuceneUtils from "./utils/lucene"
+export * as JSONUtils from "./utils/json"
+export * as CookieUtils from "./utils/cookies"
diff --git a/packages/builder/src/builderStore/store/localStorage.js b/packages/frontend-core/src/stores/localStorage.js
similarity index 93%
rename from packages/builder/src/builderStore/store/localStorage.js
rename to packages/frontend-core/src/stores/localStorage.js
index aaf10460b6..0f958c365d 100644
--- a/packages/builder/src/builderStore/store/localStorage.js
+++ b/packages/frontend-core/src/stores/localStorage.js
@@ -1,6 +1,6 @@
import { get, writable } from "svelte/store"
-export const localStorageStore = (localStorageKey, initialValue) => {
+export const createLocalStorageStore = (localStorageKey, initialValue) => {
const store = writable(initialValue, () => {
// Hydrate from local storage when we get a new subscriber
hydrate()
diff --git a/packages/builder/src/builderStore/cookies.js b/packages/frontend-core/src/utils/cookies.js
similarity index 79%
rename from packages/builder/src/builderStore/cookies.js
rename to packages/frontend-core/src/utils/cookies.js
index cb4e46ec86..5ecaa3424f 100644
--- a/packages/builder/src/builderStore/cookies.js
+++ b/packages/frontend-core/src/utils/cookies.js
@@ -1,9 +1,3 @@
-export const Cookies = {
- Auth: "budibase:auth",
- CurrentApp: "budibase:currentapp",
- ReturnUrl: "budibase:returnurl",
-}
-
export function setCookie(name, value) {
if (getCookie(name)) {
removeCookie(name)
diff --git a/packages/builder/src/builderStore/jsonUtils.js b/packages/frontend-core/src/utils/json.js
similarity index 100%
rename from packages/builder/src/builderStore/jsonUtils.js
rename to packages/frontend-core/src/utils/json.js
diff --git a/packages/builder/src/helpers/lucene.js b/packages/frontend-core/src/utils/lucene.js
similarity index 79%
rename from packages/builder/src/helpers/lucene.js
rename to packages/frontend-core/src/utils/lucene.js
index 63fe542da2..eaf681a78b 100644
--- a/packages/builder/src/helpers/lucene.js
+++ b/packages/frontend-core/src/utils/lucene.js
@@ -1,11 +1,65 @@
-import { NoEmptyFilterStrings } from "../constants/lucene"
-import { deepGet } from "@budibase/bbui"
+import { Helpers } from "@budibase/bbui"
+import { OperatorOptions } from "../constants"
+
+/**
+ * Returns the valid operator options for a certain data type
+ * @param type the data type
+ */
+export const getValidOperatorsForType = type => {
+ const Op = OperatorOptions
+ const stringOps = [
+ Op.Equals,
+ Op.NotEquals,
+ Op.StartsWith,
+ Op.Like,
+ Op.Empty,
+ Op.NotEmpty,
+ ]
+ const numOps = [
+ Op.Equals,
+ Op.NotEquals,
+ Op.MoreThan,
+ Op.LessThan,
+ Op.Empty,
+ Op.NotEmpty,
+ ]
+ if (type === "string") {
+ return stringOps
+ } else if (type === "number") {
+ return numOps
+ } else if (type === "options") {
+ return [Op.Equals, Op.NotEquals, Op.Empty, Op.NotEmpty]
+ } else if (type === "array") {
+ return [Op.Contains, Op.NotContains, Op.Empty, Op.NotEmpty]
+ } else if (type === "boolean") {
+ return [Op.Equals, Op.NotEquals, Op.Empty, Op.NotEmpty]
+ } else if (type === "longform") {
+ return stringOps
+ } else if (type === "datetime") {
+ return numOps
+ } else if (type === "formula") {
+ return stringOps.concat([Op.MoreThan, Op.LessThan])
+ }
+ return []
+}
+
+/**
+ * Operators which do not support empty strings as values
+ */
+export const NoEmptyFilterStrings = [
+ OperatorOptions.StartsWith.value,
+ OperatorOptions.Like.value,
+ OperatorOptions.Equals.value,
+ OperatorOptions.NotEquals.value,
+ OperatorOptions.Contains.value,
+ OperatorOptions.NotContains.value,
+]
/**
* Removes any fields that contain empty strings that would cause inconsistent
* behaviour with how backend tables are filtered (no value means no filter).
*/
-function cleanupQuery(query) {
+const cleanupQuery = query => {
if (!query) {
return query
}
@@ -96,7 +150,7 @@ export const buildLuceneQuery = filter => {
* @param docs the data
* @param query the JSON lucene query
*/
-export const luceneQuery = (docs, query) => {
+export const runLuceneQuery = (docs, query) => {
if (!docs || !Array.isArray(docs)) {
return []
}
@@ -112,7 +166,7 @@ export const luceneQuery = (docs, query) => {
const filters = Object.entries(query[type] || {})
for (let i = 0; i < filters.length; i++) {
const [key, testValue] = filters[i]
- const docValue = deepGet(doc, key)
+ const docValue = Helpers.deepGet(doc, key)
if (failFn(docValue, testValue)) {
return false
}
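
The long `if`/`else` chain in `getValidOperatorsForType` above could equally be written as a lookup table, which keeps the type-to-operators mapping declarative. A hedged sketch — the operator strings here are illustrative stand-ins, not the real `OperatorOptions` values:

```javascript
// Alternative sketch: the type-to-operators dispatch as a lookup table.
// Operator names are placeholders for the real OperatorOptions constants.
const stringOps = ["equal", "notEqual", "string", "fuzzy", "empty", "notEmpty"]
const numOps = ["equal", "notEqual", "rangeHigh", "rangeLow", "empty", "notEmpty"]

const opsByType = {
  string: stringOps,
  longform: stringOps,
  number: numOps,
  datetime: numOps,
  options: ["equal", "notEqual", "empty", "notEmpty"],
  boolean: ["equal", "notEqual", "empty", "notEmpty"],
  array: ["contains", "notContains", "empty", "notEmpty"],
  formula: stringOps.concat(["rangeHigh", "rangeLow"]),
}

// Unknown types get no operators, matching the chain's final `return []`.
const getValidOperatorsForType = type => opsByType[type] || []
```

Either form behaves the same; the table makes it harder to forget a branch when a new field type is added.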
diff --git a/packages/frontend-core/yarn.lock b/packages/frontend-core/yarn.lock
new file mode 100644
index 0000000000..8d4b2f14a0
--- /dev/null
+++ b/packages/frontend-core/yarn.lock
@@ -0,0 +1,13 @@
+# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
+# yarn lockfile v1
+
+
+lodash@^4.17.21:
+ version "4.17.21"
+ resolved "https://registry.yarnpkg.com/lodash/-/lodash-4.17.21.tgz#679591c564c3bffaae8454cf0b3df370c3d6911c"
+ integrity sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==
+
+svelte@^3.46.2:
+ version "3.46.2"
+ resolved "https://registry.yarnpkg.com/svelte/-/svelte-3.46.2.tgz#f0ffbffaea3a9a760edcbefc0902b41998a686ad"
+ integrity sha512-RXSAtYNefe01Sb1lXtZ2I+gzn3t/h/59hoaRNeRrm8IkMIu6BSiAkbpi41xb+C44x54YKnbk9+dtfs3pM4hECA==
diff --git a/packages/server/package.json b/packages/server/package.json
index e009dedf16..9fb110ca28 100644
--- a/packages/server/package.json
+++ b/packages/server/package.json
@@ -1,7 +1,7 @@
{
"name": "@budibase/server",
"email": "hi@budibase.com",
- "version": "1.0.49-alpha.5",
+ "version": "1.0.50-alpha.6",
"description": "Budibase Web Server",
"main": "src/index.ts",
"repository": {
@@ -70,9 +70,9 @@
"license": "GPL-3.0",
"dependencies": {
"@apidevtools/swagger-parser": "^10.0.3",
- "@budibase/backend-core": "^1.0.49-alpha.5",
- "@budibase/client": "^1.0.49-alpha.5",
- "@budibase/string-templates": "^1.0.49-alpha.5",
+ "@budibase/backend-core": "^1.0.50-alpha.6",
+ "@budibase/client": "^1.0.50-alpha.6",
+ "@budibase/string-templates": "^1.0.50-alpha.6",
"@bull-board/api": "^3.7.0",
"@bull-board/koa": "^3.7.0",
"@elastic/elasticsearch": "7.10.0",
diff --git a/packages/server/scripts/integrations/mssql/data/Dockerfile b/packages/server/scripts/integrations/mssql/data/Dockerfile
index 8ac56409a0..b8c96e8419 100644
--- a/packages/server/scripts/integrations/mssql/data/Dockerfile
+++ b/packages/server/scripts/integrations/mssql/data/Dockerfile
@@ -1,4 +1,4 @@
-FROM mcr.microsoft.com/mssql/server
+FROM mcr.microsoft.com/mssql/server:2017-latest
ENV ACCEPT_EULA=Y
ENV SA_PASSWORD=Passw0rd
diff --git a/packages/server/scripts/integrations/mssql/data/setup.sql b/packages/server/scripts/integrations/mssql/data/setup.sql
index 766388f46a..b6ab4f5274 100644
--- a/packages/server/scripts/integrations/mssql/data/setup.sql
+++ b/packages/server/scripts/integrations/mssql/data/setup.sql
@@ -36,19 +36,28 @@ CREATE TABLE people
INSERT products
(name, description)
VALUES
- ('Bananas', 'Fruit thing');
-
-INSERT products
- (name, description)
-VALUES
+ ('Bananas', 'Fruit thing'),
('Meat', 'Animal thing');
-
+
INSERT tasks
(taskname, productid)
VALUES
('Processing', 1);
-INSERT people
- (name, age)
-VALUES
- ('Bob', '30');
+INSERT INTO people (name, age)
+VALUES ('Bob', '30'),
+ ('Bert', '10'),
+ ('Jack', '12'),
+ ('Mike', '31'),
+ ('Dave', '44'),
+ ('Jim', '43'),
+ ('Kerry', '32'),
+ ('Julie', '12'),
+ ('Kim', '55'),
+ ('Andy', '33'),
+ ('John', '22'),
+ ('Ruth', '66'),
+ ('Robert', '88'),
+ ('Bobert', '99'),
+ ('Jan', '22'),
+ ('Megan', '11');
diff --git a/packages/server/scripts/integrations/mysql/init.sql b/packages/server/scripts/integrations/mysql/init.sql
index 4dd75c36d3..f37ef0d532 100644
--- a/packages/server/scripts/integrations/mysql/init.sql
+++ b/packages/server/scripts/integrations/mysql/init.sql
@@ -19,6 +19,12 @@ CREATE TABLE Tasks (
FOREIGN KEY(PersonID)
REFERENCES Persons(PersonID)
);
+CREATE TABLE Products (
+ id serial primary key,
+ name text,
+ updated time
+);
INSERT INTO Persons (FirstName, LastName, Age, Address, City, CreatedAt) VALUES ('Mike', 'Hughes', 28.2, '123 Fake Street', 'Belfast', '2021-01-19 03:14:07');
INSERT INTO Tasks (PersonID, TaskName) VALUES (1, 'assembling');
INSERT INTO Tasks (PersonID, TaskName) VALUES (1, 'processing');
+INSERT INTO Products (name, updated) VALUES ('Meat', '11:00:22'), ('Fruit', '10:00:00');
diff --git a/packages/server/scripts/integrations/postgres/init.sql b/packages/server/scripts/integrations/postgres/init.sql
index cc2fc734f8..a25106ca32 100644
--- a/packages/server/scripts/integrations/postgres/init.sql
+++ b/packages/server/scripts/integrations/postgres/init.sql
@@ -9,12 +9,16 @@ CREATE TABLE Persons (
);
CREATE TABLE Tasks (
TaskID SERIAL PRIMARY KEY,
- PersonID INT,
+ ExecutorID INT,
+ QaID INT,
Completed BOOLEAN,
TaskName varchar(255),
- CONSTRAINT fkPersons
- FOREIGN KEY(PersonID)
- REFERENCES Persons(PersonID)
+ CONSTRAINT fkexecutor
+ FOREIGN KEY(ExecutorID)
+ REFERENCES Persons(PersonID),
+ CONSTRAINT fkqa
+ FOREIGN KEY(QaID)
+ REFERENCES Persons(PersonID)
);
CREATE TABLE Products (
ProductID SERIAL PRIMARY KEY,
@@ -32,8 +36,9 @@ CREATE TABLE Products_Tasks (
PRIMARY KEY (ProductID, TaskID)
);
INSERT INTO Persons (FirstName, LastName, Address, City) VALUES ('Mike', 'Hughes', '123 Fake Street', 'Belfast');
-INSERT INTO Tasks (PersonID, TaskName, Completed) VALUES (1, 'assembling', TRUE);
-INSERT INTO Tasks (PersonID, TaskName, Completed) VALUES (1, 'processing', FALSE);
+INSERT INTO Persons (FirstName, LastName, Address, City) VALUES ('John', 'Smith', '64 Updown Road', 'Dublin');
+INSERT INTO Tasks (ExecutorID, QaID, TaskName, Completed) VALUES (1, 2, 'assembling', TRUE);
+INSERT INTO Tasks (ExecutorID, QaID, TaskName, Completed) VALUES (2, 1, 'processing', FALSE);
INSERT INTO Products (ProductName) VALUES ('Computers');
INSERT INTO Products (ProductName) VALUES ('Laptops');
INSERT INTO Products (ProductName) VALUES ('Chairs');
diff --git a/packages/server/src/api/controllers/application.js b/packages/server/src/api/controllers/application.js
index eb1f7bc5e6..00d3efccb8 100644
--- a/packages/server/src/api/controllers/application.js
+++ b/packages/server/src/api/controllers/application.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../db")
const env = require("../../environment")
const packageJson = require("../../../package.json")
const {
@@ -29,7 +28,7 @@ const { processObject } = require("@budibase/string-templates")
const {
getAllApps,
isDevAppID,
- getDeployedAppID,
+ getProdAppID,
Replication,
} = require("@budibase/backend-core/db")
const { USERS_TABLE_SCHEMA } = require("../../constants")
@@ -45,11 +44,17 @@ const { getTenantId, isMultiTenant } = require("@budibase/backend-core/tenancy")
const { syncGlobalUsers } = require("./user")
const { app: appCache } = require("@budibase/backend-core/cache")
const { cleanupAutomations } = require("../../automations/utils")
+const {
+ getAppDB,
+ getProdAppDB,
+ updateAppId,
+} = require("@budibase/backend-core/context")
const URL_REGEX_SLASH = /\/|\\/g
// utility function, need to do away with this
-async function getLayouts(db) {
+async function getLayouts() {
+ const db = getAppDB()
return (
await db.allDocs(
getLayoutParams(null, {
@@ -59,7 +64,8 @@ async function getLayouts(db) {
).rows.map(row => row.doc)
}
-async function getScreens(db) {
+async function getScreens() {
+ const db = getAppDB()
return (
await db.allDocs(
getScreenParams(null, {
@@ -117,8 +123,9 @@ async function createInstance(template) {
const tenantId = isMultiTenant() ? getTenantId() : null
const baseAppId = generateAppID(tenantId)
const appId = generateDevAppID(baseAppId)
+ updateAppId(appId)
- const db = new CouchDB(appId)
+ const db = getAppDB()
await db.put({
_id: "_design/database",
// view collation information, read before writing any complex views:
@@ -128,9 +135,9 @@ async function createInstance(template) {
// NOTE: indexes need to be created before any tables/templates
// add view for linked rows
- await createLinkView(appId)
- await createRoutingView(appId)
- await createAllSearchIndex(appId)
+ await createLinkView()
+ await createRoutingView()
+ await createAllSearchIndex()
// replicate the template data to the instance DB
// this is currently very hard to test, downloading and importing template files
@@ -156,7 +163,7 @@ async function createInstance(template) {
exports.fetch = async ctx => {
const dev = ctx.query && ctx.query.status === AppStatus.DEV
const all = ctx.query && ctx.query.status === AppStatus.ALL
- const apps = await getAllApps(CouchDB, { dev, all })
+ const apps = await getAllApps({ dev, all })
// get the locks for all the dev apps
if (dev || all) {
@@ -179,12 +186,11 @@ exports.fetch = async ctx => {
}
exports.fetchAppDefinition = async ctx => {
- const db = new CouchDB(ctx.params.appId)
- const layouts = await getLayouts(db)
+ const layouts = await getLayouts()
const userRoleId = getUserRoleId(ctx)
- const accessController = new AccessController(ctx.params.appId)
+ const accessController = new AccessController()
const screens = await accessController.checkScreensAccess(
- await getScreens(db),
+ await getScreens(),
userRoleId
)
ctx.body = {
@@ -195,15 +201,15 @@ exports.fetchAppDefinition = async ctx => {
}
exports.fetchAppPackage = async ctx => {
- const db = new CouchDB(ctx.params.appId)
+ const db = getAppDB()
const application = await db.get(DocumentTypes.APP_METADATA)
- const layouts = await getLayouts(db)
- let screens = await getScreens(db)
+ const layouts = await getLayouts()
+ let screens = await getScreens()
// Only filter screens if the user is not a builder
if (!(ctx.user.builder && ctx.user.builder.global)) {
const userRoleId = getUserRoleId(ctx)
- const accessController = new AccessController(ctx.params.appId)
+ const accessController = new AccessController()
screens = await accessController.checkScreensAccess(screens, userRoleId)
}
@@ -216,7 +222,7 @@ exports.fetchAppPackage = async ctx => {
}
exports.create = async ctx => {
- const apps = await getAllApps(CouchDB, { dev: true })
+ const apps = await getAllApps({ dev: true })
const name = ctx.request.body.name
checkAppName(ctx, apps, name)
const url = exports.getAppUrl(ctx)
@@ -234,7 +240,7 @@ exports.create = async ctx => {
const instance = await createInstance(instanceConfig)
const appId = instance._id
- const db = new CouchDB(appId)
+ const db = getAppDB()
let _rev
try {
// if template there will be an existing doc
@@ -280,7 +286,7 @@ exports.create = async ctx => {
// This endpoint currently operates as a PATCH rather than a PUT
// Thus name and url fields are handled only if present
exports.update = async ctx => {
- const apps = await getAllApps(CouchDB, { dev: true })
+ const apps = await getAllApps({ dev: true })
// validation
const name = ctx.request.body.name
if (name) {
@@ -299,7 +305,7 @@ exports.update = async ctx => {
exports.updateClient = async ctx => {
// Get current app version
- const db = new CouchDB(ctx.params.appId)
+ const db = getAppDB()
const application = await db.get(DocumentTypes.APP_METADATA)
const currentVersion = application.version
@@ -321,7 +327,7 @@ exports.updateClient = async ctx => {
exports.revertClient = async ctx => {
// Check app can be reverted
- const db = new CouchDB(ctx.params.appId)
+ const db = getAppDB()
const application = await db.get(DocumentTypes.APP_METADATA)
if (!application.revertableVersion) {
ctx.throw(400, "There is no version to revert to")
@@ -343,7 +349,7 @@ exports.revertClient = async ctx => {
}
exports.delete = async ctx => {
- const db = new CouchDB(ctx.params.appId)
+ const db = getAppDB()
const result = await db.destroy()
/* istanbul ignore next */
@@ -368,10 +374,11 @@ exports.sync = async (ctx, next) => {
}
// replicate prod to dev
- const prodAppId = getDeployedAppID(appId)
+ const prodAppId = getProdAppID(appId)
try {
- const prodDb = new CouchDB(prodAppId, { skip_setup: true })
+  // specific case: make sure database setup is skipped
+ const prodDb = getProdAppDB({ skip_setup: true })
const info = await prodDb.info()
if (info.error) throw info.error
} catch (err) {
@@ -399,7 +406,7 @@ exports.sync = async (ctx, next) => {
}
// sync the users
- await syncGlobalUsers(appId)
+ await syncGlobalUsers()
if (error) {
ctx.throw(400, error)
@@ -411,7 +418,7 @@ exports.sync = async (ctx, next) => {
}
const updateAppPackage = async (appPackage, appId) => {
- const db = new CouchDB(appId)
+ const db = getAppDB()
const application = await db.get(DocumentTypes.APP_METADATA)
const newAppPackage = { ...application, ...appPackage }
@@ -430,7 +437,7 @@ const updateAppPackage = async (appPackage, appId) => {
}
const createEmptyAppPackage = async (ctx, app) => {
- const db = new CouchDB(app.appId)
+ const db = getAppDB()
let screensAndLayouts = []
for (let layout of BASE_LAYOUTS) {
diff --git a/packages/server/src/api/controllers/auth.js b/packages/server/src/api/controllers/auth.js
index f1b665c069..30c0e5d09c 100644
--- a/packages/server/src/api/controllers/auth.js
+++ b/packages/server/src/api/controllers/auth.js
@@ -1,11 +1,10 @@
-const CouchDB = require("../../db")
const { outputProcessing } = require("../../utilities/rowProcessor")
const { InternalTables } = require("../../db/utils")
const { getFullUser } = require("../../utilities/users")
const { BUILTIN_ROLE_IDS } = require("@budibase/backend-core/roles")
+const { getAppDB, getAppId } = require("@budibase/backend-core/context")
exports.fetchSelf = async ctx => {
- const appId = ctx.appId
let userId = ctx.user.userId || ctx.user._id
/* istanbul ignore next */
if (!userId) {
@@ -19,8 +18,8 @@ exports.fetchSelf = async ctx => {
// forward the csrf token from the session
user.csrfToken = ctx.user.csrfToken
- if (appId) {
- const db = new CouchDB(appId)
+ if (getAppId()) {
+ const db = getAppDB()
// remove the full roles structure
delete user.roles
try {
@@ -29,7 +28,7 @@ exports.fetchSelf = async ctx => {
// make sure there is never a stale csrf token
delete metadata.csrfToken
// specifically needs to make sure is enriched
- ctx.body = await outputProcessing(ctx, userTable, {
+ ctx.body = await outputProcessing(userTable, {
...user,
...metadata,
})
diff --git a/packages/server/src/api/controllers/automation.js b/packages/server/src/api/controllers/automation.js
index 05337579a0..74942dad40 100644
--- a/packages/server/src/api/controllers/automation.js
+++ b/packages/server/src/api/controllers/automation.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../db")
const actions = require("../../automations/actions")
const triggers = require("../../automations/triggers")
const { getAutomationParams, generateAutomationID } = require("../../db/utils")
@@ -10,6 +9,7 @@ const {
const { deleteEntityMetadata } = require("../../utilities")
const { MetadataTypes } = require("../../constants")
const { setTestFlag, clearTestFlag } = require("../../utilities/redis")
+const { getAppDB } = require("@budibase/backend-core/context")
const ACTION_DEFS = removeDeprecated(actions.ACTION_DEFINITIONS)
const TRIGGER_DEFS = removeDeprecated(triggers.TRIGGER_DEFINITIONS)
@@ -20,14 +20,9 @@ const TRIGGER_DEFS = removeDeprecated(triggers.TRIGGER_DEFINITIONS)
* *
*************************/
-async function cleanupAutomationMetadata(appId, automationId) {
+async function cleanupAutomationMetadata(automationId) {
+ await deleteEntityMetadata(MetadataTypes.AUTOMATION_TEST_INPUT, automationId)
await deleteEntityMetadata(
- appId,
- MetadataTypes.AUTOMATION_TEST_INPUT,
- automationId
- )
- await deleteEntityMetadata(
- appId,
MetadataTypes.AUTOMATION_TEST_HISTORY,
automationId
)
@@ -58,7 +53,7 @@ function cleanAutomationInputs(automation) {
}
exports.create = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
let automation = ctx.request.body
automation.appId = ctx.appId
@@ -72,7 +67,6 @@ exports.create = async function (ctx) {
automation.type = "automation"
automation = cleanAutomationInputs(automation)
automation = await checkForWebhooks({
- appId: ctx.appId,
newAuto: automation,
})
const response = await db.put(automation)
@@ -89,13 +83,12 @@ exports.create = async function (ctx) {
}
exports.update = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
let automation = ctx.request.body
automation.appId = ctx.appId
const oldAutomation = await db.get(automation._id)
automation = cleanAutomationInputs(automation)
automation = await checkForWebhooks({
- appId: ctx.appId,
oldAuto: oldAutomation,
newAuto: automation,
})
@@ -131,7 +124,7 @@ exports.update = async function (ctx) {
}
exports.fetch = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const response = await db.allDocs(
getAutomationParams(null, {
include_docs: true,
@@ -141,20 +134,19 @@ exports.fetch = async function (ctx) {
}
exports.find = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
ctx.body = await db.get(ctx.params.id)
}
exports.destroy = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const automationId = ctx.params.id
const oldAutomation = await db.get(automationId)
await checkForWebhooks({
- appId: ctx.appId,
oldAuto: oldAutomation,
})
// delete metadata first
- await cleanupAutomationMetadata(ctx.appId, automationId)
+ await cleanupAutomationMetadata(automationId)
ctx.body = await db.remove(automationId, ctx.params.rev)
}
@@ -180,12 +172,11 @@ module.exports.getDefinitionList = async function (ctx) {
*********************/
exports.trigger = async function (ctx) {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
let automation = await db.get(ctx.params.id)
await triggers.externalTrigger(automation, {
...ctx.request.body,
- appId,
+ appId: ctx.appId,
})
ctx.body = {
message: `Automation ${automation._id} has been triggered.`,
@@ -205,8 +196,7 @@ function prepareTestInput(input) {
}
exports.test = async function (ctx) {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
let automation = await db.get(ctx.params.id)
await setTestFlag(automation._id)
const testInput = prepareTestInput(ctx.request.body)
@@ -214,7 +204,7 @@ exports.test = async function (ctx) {
automation,
{
...testInput,
- appId,
+ appId: ctx.appId,
},
{ getResponses: true }
)
diff --git a/packages/server/src/api/controllers/cloud.js b/packages/server/src/api/controllers/cloud.js
index ea6cc9b71e..38804f4d4a 100644
--- a/packages/server/src/api/controllers/cloud.js
+++ b/packages/server/src/api/controllers/cloud.js
@@ -1,6 +1,5 @@
const env = require("../../environment")
const { getAllApps } = require("@budibase/backend-core/db")
-const CouchDB = require("../../db")
const {
exportDB,
sendTempFile,
@@ -30,7 +29,7 @@ exports.exportApps = async ctx => {
if (env.SELF_HOSTED || !env.MULTI_TENANCY) {
ctx.throw(400, "Exporting only allowed in multi-tenant cloud environments.")
}
- const apps = await getAllApps(CouchDB, { all: true })
+ const apps = await getAllApps({ all: true })
const globalDBString = await exportDB(getGlobalDBName(), {
filter: doc => !doc._id.startsWith(DocumentTypes.USER),
})
@@ -63,7 +62,7 @@ async function hasBeenImported() {
if (!env.SELF_HOSTED || env.MULTI_TENANCY) {
return true
}
- const apps = await getAllApps(CouchDB, { all: true })
+ const apps = await getAllApps({ all: true })
return apps.length !== 0
}
diff --git a/packages/server/src/api/controllers/component.js b/packages/server/src/api/controllers/component.js
index 06cb2cd211..2d0aaea23a 100644
--- a/packages/server/src/api/controllers/component.js
+++ b/packages/server/src/api/controllers/component.js
@@ -1,15 +1,14 @@
-const CouchDB = require("../../db")
const { DocumentTypes } = require("../../db/utils")
const { getComponentLibraryManifest } = require("../../utilities/fileSystem")
+const { getAppDB } = require("@budibase/backend-core/context")
exports.fetchAppComponentDefinitions = async function (ctx) {
- const appId = ctx.params.appId || ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const app = await db.get(DocumentTypes.APP_METADATA)
let componentManifests = await Promise.all(
app.componentLibraries.map(async library => {
- let manifest = await getComponentLibraryManifest(appId, library)
+ let manifest = await getComponentLibraryManifest(library)
return {
manifest,
diff --git a/packages/server/src/api/controllers/datasource.js b/packages/server/src/api/controllers/datasource.js
index 5ab3c0a865..999f322563 100644
--- a/packages/server/src/api/controllers/datasource.js
+++ b/packages/server/src/api/controllers/datasource.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../db")
const {
generateDatasourceID,
getDatasourceParams,
@@ -11,12 +10,11 @@ const { BuildSchemaErrors, InvalidColumns } = require("../../constants")
const { integrations } = require("../../integrations")
const { getDatasourceAndQuery } = require("./row/utils")
const { invalidateDynamicVariables } = require("../../threads/utils")
+const { getAppDB } = require("@budibase/backend-core/context")
exports.fetch = async function (ctx) {
- const database = new CouchDB(ctx.appId)
-
// Get internal tables
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const internalTables = await db.allDocs(
getTableParams(null, {
include_docs: true,
@@ -31,7 +29,7 @@ exports.fetch = async function (ctx) {
// Get external datasources
const datasources = (
- await database.allDocs(
+ await db.allDocs(
getDatasourceParams(null, {
include_docs: true,
})
@@ -49,7 +47,7 @@ exports.fetch = async function (ctx) {
}
exports.buildSchemaFromDb = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const datasource = await db.get(ctx.params.datasourceId)
const { tables, error } = await buildSchemaHelper(datasource)
@@ -98,7 +96,7 @@ const invalidateVariables = async (existingDatasource, updatedDatasource) => {
}
exports.update = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const datasourceId = ctx.params.datasourceId
let datasource = await db.get(datasourceId)
const auth = datasource.config.auth
@@ -126,7 +124,7 @@ exports.update = async function (ctx) {
}
exports.save = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const plus = ctx.request.body.datasource.plus
const fetchSchema = ctx.request.body.fetchSchema
@@ -162,7 +160,7 @@ exports.save = async function (ctx) {
}
exports.destroy = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
// Delete all queries for the datasource
const queries = await db.allDocs(
@@ -184,7 +182,7 @@ exports.destroy = async function (ctx) {
}
exports.find = async function (ctx) {
- const database = new CouchDB(ctx.appId)
+ const database = getAppDB()
ctx.body = await database.get(ctx.params.datasourceId)
}
@@ -192,7 +190,7 @@ exports.find = async function (ctx) {
exports.query = async function (ctx) {
const queryJson = ctx.request.body
try {
- ctx.body = await getDatasourceAndQuery(ctx.appId, queryJson)
+ ctx.body = await getDatasourceAndQuery(queryJson)
} catch (err) {
ctx.throw(400, err)
}
diff --git a/packages/server/src/api/controllers/deploy/Deployment.js b/packages/server/src/api/controllers/deploy/Deployment.js
index b398aa2e6d..65cca97d07 100644
--- a/packages/server/src/api/controllers/deploy/Deployment.js
+++ b/packages/server/src/api/controllers/deploy/Deployment.js
@@ -1,18 +1,14 @@
const newid = require("../../../db/newid")
+const { getAppId } = require("@budibase/backend-core/context")
/**
* This is used to pass around information about the deployment that is occurring
*/
class Deployment {
- constructor(appId, id = null) {
- this.appId = appId
+ constructor(id = null) {
this._id = id || newid()
}
- getAppId() {
- return this.appId
- }
-
setVerification(verification) {
if (!verification) {
return
@@ -43,7 +39,7 @@ class Deployment {
getJSON() {
const obj = {
_id: this._id,
- appId: this.appId,
+ appId: getAppId(),
status: this.status,
}
if (this.err) {
diff --git a/packages/server/src/api/controllers/deploy/index.js b/packages/server/src/api/controllers/deploy/index.js
index 76d7b75912..4186a192a4 100644
--- a/packages/server/src/api/controllers/deploy/index.js
+++ b/packages/server/src/api/controllers/deploy/index.js
@@ -1,12 +1,20 @@
-const CouchDB = require("../../../db")
const Deployment = require("./Deployment")
-const { Replication, getDeployedAppID } = require("@budibase/backend-core/db")
+const {
+ Replication,
+ getProdAppID,
+ getDevelopmentAppID,
+} = require("@budibase/backend-core/db")
const { DocumentTypes, getAutomationParams } = require("../../../db/utils")
const {
disableAllCrons,
enableCronTrigger,
} = require("../../../automations/utils")
const { app: appCache } = require("@budibase/backend-core/cache")
+const {
+ getAppId,
+ getAppDB,
+ getProdAppDB,
+} = require("@budibase/backend-core/context")
// the max time we can wait for an invalidation to complete before considering it failed
const MAX_PENDING_TIME_MS = 30 * 60000
@@ -34,9 +42,8 @@ async function checkAllDeployments(deployments) {
}
async function storeDeploymentHistory(deployment) {
- const appId = deployment.getAppId()
const deploymentJSON = deployment.getJSON()
- const db = new CouchDB(appId)
+ const db = getAppDB()
let deploymentDoc
try {
@@ -64,7 +71,7 @@ async function storeDeploymentHistory(deployment) {
}
async function initDeployedApp(prodAppId) {
- const db = new CouchDB(prodAppId)
+ const db = getProdAppDB()
console.log("Reading automation docs")
const automations = (
await db.allDocs(
@@ -88,10 +95,12 @@ async function initDeployedApp(prodAppId) {
async function deployApp(deployment) {
try {
- const productionAppId = getDeployedAppID(deployment.appId)
+ const appId = getAppId()
+ const devAppId = getDevelopmentAppID(appId)
+ const productionAppId = getProdAppID(appId)
const replication = new Replication({
- source: deployment.appId,
+ source: devAppId,
target: productionAppId,
})
@@ -99,7 +108,7 @@ async function deployApp(deployment) {
await replication.replicate()
console.log("replication complete.. replacing app meta doc")
- const db = new CouchDB(productionAppId)
+ const db = getProdAppDB()
const appDoc = await db.get(DocumentTypes.APP_METADATA)
appDoc.appId = productionAppId
appDoc.instance._id = productionAppId
@@ -122,8 +131,7 @@ async function deployApp(deployment) {
exports.fetchDeployments = async function (ctx) {
try {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const deploymentDoc = await db.get(DocumentTypes.DEPLOYMENTS)
const { updated, deployments } = await checkAllDeployments(
deploymentDoc,
@@ -140,8 +148,7 @@ exports.fetchDeployments = async function (ctx) {
exports.deploymentProgress = async function (ctx) {
try {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const deploymentDoc = await db.get(DocumentTypes.DEPLOYMENTS)
ctx.body = deploymentDoc[ctx.params.deploymentId]
} catch (err) {
@@ -153,7 +160,7 @@ exports.deploymentProgress = async function (ctx) {
}
exports.deployApp = async function (ctx) {
- let deployment = new Deployment(ctx.appId)
+ let deployment = new Deployment()
console.log("Deployment object created")
deployment.setStatus(DeploymentStatus.PENDING)
console.log("Deployment object set to pending")
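`deployApp` above now derives both the development and production app IDs from whichever app ID is in context, using helpers imported from `@budibase/backend-core/db`. A rough sketch of the transform those helpers perform, assuming the conventional `app_dev_`/`app_` prefixes (the real implementations may differ in detail):

```javascript
// illustrative sketches of getProdAppID / getDevelopmentAppID -
// Budibase dev apps are prefixed app_dev_, deployed apps just app_
const DEV_PREFIX = "app_dev_"
const PROD_PREFIX = "app_"

function getProdAppID(appId) {
  return appId.startsWith(DEV_PREFIX)
    ? PROD_PREFIX + appId.slice(DEV_PREFIX.length)
    : appId
}

function getDevelopmentAppID(appId) {
  return appId.startsWith(DEV_PREFIX)
    ? appId
    : DEV_PREFIX + appId.slice(PROD_PREFIX.length)
}
```

This is also why the `dev.js` hunk below can swap the hand-rolled `appId.replace("_dev", "")` for `getProdAppID(appId)`: both helpers are idempotent, so they are safe to call with either form of ID.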
diff --git a/packages/server/src/api/controllers/dev.js b/packages/server/src/api/controllers/dev.js
index 3126454a6b..bec9478245 100644
--- a/packages/server/src/api/controllers/dev.js
+++ b/packages/server/src/api/controllers/dev.js
@@ -1,12 +1,12 @@
const fetch = require("node-fetch")
-const CouchDB = require("../../db")
const env = require("../../environment")
const { checkSlashesInUrl } = require("../../utilities")
const { request } = require("../../utilities/workerRequests")
const { clearLock } = require("../../utilities/redis")
-const { Replication } = require("@budibase/backend-core/db")
+const { Replication, getProdAppID } = require("@budibase/backend-core/db")
const { DocumentTypes } = require("../../db/utils")
const { app: appCache } = require("@budibase/backend-core/cache")
+const { getProdAppDB, getAppDB } = require("@budibase/backend-core/context")
async function redirect(ctx, method, path = "global") {
const { devPath } = ctx.params
@@ -77,11 +77,11 @@ exports.clearLock = async ctx => {
exports.revert = async ctx => {
const { appId } = ctx.params
- const productionAppId = appId.replace("_dev", "")
+ const productionAppId = getProdAppID(appId)
// App must have been deployed first
try {
- const db = new CouchDB(productionAppId, { skip_setup: true })
+ const db = getProdAppDB({ skip_setup: true })
const info = await db.info()
if (info.error) throw info.error
const deploymentDoc = await db.get(DocumentTypes.DEPLOYMENTS)
@@ -103,7 +103,7 @@ exports.revert = async ctx => {
await replication.rollback()
// update appID in reverted app to be dev version again
- const db = new CouchDB(appId)
+ const db = getAppDB()
const appDoc = await db.get(DocumentTypes.APP_METADATA)
appDoc.appId = appId
appDoc.instance._id = appId
diff --git a/packages/server/src/api/controllers/layout.js b/packages/server/src/api/controllers/layout.js
index c3cae1b6a7..a92eec424a 100644
--- a/packages/server/src/api/controllers/layout.js
+++ b/packages/server/src/api/controllers/layout.js
@@ -2,11 +2,11 @@ const {
EMPTY_LAYOUT,
BASE_LAYOUT_PROP_IDS,
} = require("../../constants/layouts")
-const CouchDB = require("../../db")
const { generateLayoutID, getScreenParams } = require("../../db/utils")
+const { getAppDB } = require("@budibase/backend-core/context")
exports.save = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
let layout = ctx.request.body
if (!layout.props) {
@@ -26,7 +26,7 @@ exports.save = async function (ctx) {
}
exports.destroy = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const layoutId = ctx.params.layoutId,
layoutRev = ctx.params.layoutRev
diff --git a/packages/server/src/api/controllers/metadata.js b/packages/server/src/api/controllers/metadata.js
index 75236650fd..e68db9b003 100644
--- a/packages/server/src/api/controllers/metadata.js
+++ b/packages/server/src/api/controllers/metadata.js
@@ -1,7 +1,7 @@
const { MetadataTypes } = require("../../constants")
-const CouchDB = require("../../db")
const { generateMetadataID } = require("../../db/utils")
const { saveEntityMetadata, deleteEntityMetadata } = require("../../utilities")
+const { getAppDB } = require("@budibase/backend-core/context")
exports.getTypes = async ctx => {
ctx.body = {
@@ -14,17 +14,12 @@ exports.saveMetadata = async ctx => {
if (type === MetadataTypes.AUTOMATION_TEST_HISTORY) {
ctx.throw(400, "Cannot save automation history type")
}
- ctx.body = await saveEntityMetadata(
- ctx.appId,
- type,
- entityId,
- ctx.request.body
- )
+ ctx.body = await saveEntityMetadata(type, entityId, ctx.request.body)
}
exports.deleteMetadata = async ctx => {
const { type, entityId } = ctx.params
- await deleteEntityMetadata(ctx.appId, type, entityId)
+ await deleteEntityMetadata(type, entityId)
ctx.body = {
message: "Metadata deleted successfully",
}
@@ -32,7 +27,7 @@ exports.deleteMetadata = async ctx => {
exports.getMetadata = async ctx => {
const { type, entityId } = ctx.params
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const id = generateMetadataID(type, entityId)
try {
ctx.body = await db.get(id)
diff --git a/packages/server/src/api/controllers/permission.js b/packages/server/src/api/controllers/permission.js
index 5c42fe77ef..0e37a3e7d3 100644
--- a/packages/server/src/api/controllers/permission.js
+++ b/packages/server/src/api/controllers/permission.js
@@ -6,12 +6,12 @@ const {
getBuiltinRoles,
} = require("@budibase/backend-core/roles")
const { getRoleParams } = require("../../db/utils")
-const CouchDB = require("../../db")
const {
CURRENTLY_SUPPORTED_LEVELS,
getBasePermissions,
} = require("../../utilities/security")
const { removeFromArray } = require("../../utilities")
+const { getAppDB } = require("@budibase/backend-core/context")
const PermissionUpdateType = {
REMOVE: "remove",
@@ -35,7 +35,7 @@ async function updatePermissionOnRole(
{ roleId, resourceId, level },
updateType
) {
- const db = new CouchDB(appId)
+ const db = getAppDB()
const remove = updateType === PermissionUpdateType.REMOVE
const isABuiltin = isBuiltin(roleId)
const dbRoleId = getDBRoleID(roleId)
@@ -106,7 +106,7 @@ exports.fetchLevels = function (ctx) {
}
exports.fetch = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const roles = await getAllDBRoles(db)
let permissions = {}
// create an object with structure role ID -> resource ID -> level
@@ -133,7 +133,7 @@ exports.fetch = async function (ctx) {
exports.getResourcePerms = async function (ctx) {
const resourceId = ctx.params.resourceId
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const body = await db.allDocs(
getRoleParams(null, {
include_docs: true,
diff --git a/packages/server/src/api/controllers/query/import/index.ts b/packages/server/src/api/controllers/query/import/index.ts
index 933d6b101c..593fb05fd3 100644
--- a/packages/server/src/api/controllers/query/import/index.ts
+++ b/packages/server/src/api/controllers/query/import/index.ts
@@ -1,10 +1,11 @@
-import CouchDB from "../../../../db"
import { queryValidation } from "../validation"
import { generateQueryID } from "../../../../db/utils"
import { ImportInfo, ImportSource } from "./sources/base"
import { OpenAPI2 } from "./sources/openapi2"
import { Query } from "./../../../../definitions/common"
import { Curl } from "./sources/curl"
+// @ts-ignore
+import { getAppDB } from "@budibase/backend-core/context"
interface ImportResult {
errorQueries: Query[]
queries: Query[]
@@ -33,10 +34,7 @@ export class RestImporter {
return this.source.getInfo()
}
- importQueries = async (
- appId: string,
- datasourceId: string
- ): Promise<ImportResult> => {
+ importQueries = async (datasourceId: string): Promise<ImportResult> => {
// construct the queries
let queries = await this.source.getQueries(datasourceId)
@@ -58,7 +56,7 @@ export class RestImporter {
})
// persist queries
- const db = new CouchDB(appId)
+ const db = getAppDB()
const response = await db.bulkDocs(queries)
// create index to separate queries and errors
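A CouchDB-style `bulkDocs` call returns one result per document, and the importer then splits the saved queries from the failed ones. A hypothetical sketch of that partitioning step (`partitionImportResults` is an invented name for illustration, not part of `RestImporter`):

```javascript
// hypothetical helper: split queries into saved/errored based on the
// per-document results of a CouchDB-style bulkDocs response
function partitionImportResults(queries, bulkDocsResponse) {
  const errorQueries = []
  const saved = []
  bulkDocsResponse.forEach((result, i) => {
    // each result corresponds positionally to the submitted document
    if (result.error) {
      errorQueries.push(queries[i])
    } else {
      saved.push(queries[i])
    }
  })
  return { queries: saved, errorQueries }
}
```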
diff --git a/packages/server/src/api/controllers/query/import/tests/index.spec.js b/packages/server/src/api/controllers/query/import/tests/index.spec.js
index 5a509d2258..8d074ea885 100644
--- a/packages/server/src/api/controllers/query/import/tests/index.spec.js
+++ b/packages/server/src/api/controllers/query/import/tests/index.spec.js
@@ -6,6 +6,7 @@ const db = jest.fn(() => {
}
})
jest.mock("../../../../../db", () => db)
+require("@budibase/backend-core").init(require("../../../../../db"))
const { RestImporter } = require("../index")
@@ -77,7 +78,7 @@ describe("Rest Importer", () => {
const testImportQueries = async (key, data, assertions) => {
await init(data)
bulkDocs.mockReturnValue([])
- const importResult = await restImporter.importQueries("appId", "datasourceId")
+ const importResult = await restImporter.importQueries("datasourceId")
expect(importResult.errorQueries.length).toBe(0)
expect(importResult.queries.length).toBe(assertions[key].count)
expect(bulkDocs).toHaveBeenCalledTimes(1)
diff --git a/packages/server/src/api/controllers/query/index.js b/packages/server/src/api/controllers/query/index.js
index 9cf7612e8a..7a179bab35 100644
--- a/packages/server/src/api/controllers/query/index.js
+++ b/packages/server/src/api/controllers/query/index.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../../db")
const {
generateQueryID,
getQueryParams,
@@ -10,6 +9,7 @@ const { save: saveDatasource } = require("../datasource")
const { RestImporter } = require("./import")
const { invalidateDynamicVariables } = require("../../../threads/utils")
const environment = require("../../../environment")
+const { getAppDB } = require("@budibase/backend-core/context")
const Runner = new Thread(ThreadType.QUERY, {
timeoutMs: environment.QUERY_THREAD_TIMEOUT || 10000,
@@ -28,7 +28,7 @@ function enrichQueries(input) {
}
exports.fetch = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const body = await db.allDocs(
getQueryParams(null, {
@@ -69,7 +69,7 @@ exports.import = async ctx => {
datasourceId = body.datasourceId
}
- const importResult = await importer.importQueries(ctx.appId, datasourceId)
+ const importResult = await importer.importQueries(datasourceId)
ctx.body = {
...importResult,
@@ -79,7 +79,7 @@ exports.import = async ctx => {
}
exports.save = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const query = ctx.request.body
if (!query._id) {
@@ -94,7 +94,7 @@ exports.save = async function (ctx) {
}
exports.find = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const query = enrichQueries(await db.get(ctx.params.queryId))
// remove properties that could be dangerous in a real app
if (isProdAppID(ctx.appId)) {
@@ -105,7 +105,7 @@ exports.find = async function (ctx) {
}
exports.preview = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const datasource = await db.get(ctx.request.body.datasourceId)
// preview may not have a queryId as it hasn't been saved, but if it does
@@ -136,7 +136,7 @@ exports.preview = async function (ctx) {
}
async function execute(ctx, opts = { rowsOnly: false }) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const query = await db.get(ctx.params.queryId)
const datasource = await db.get(query.datasourceId)
@@ -181,7 +181,8 @@ exports.executeV2 = async function (ctx) {
return execute(ctx, { rowsOnly: false })
}
-const removeDynamicVariables = async (db, queryId) => {
+const removeDynamicVariables = async queryId => {
+ const db = getAppDB()
const query = await db.get(queryId)
const datasource = await db.get(query.datasourceId)
const dynamicVariables = datasource.config.dynamicVariables
@@ -202,8 +203,8 @@ const removeDynamicVariables = async (db, queryId) => {
}
exports.destroy = async function (ctx) {
- const db = new CouchDB(ctx.appId)
- await removeDynamicVariables(db, ctx.params.queryId)
+ const db = getAppDB()
+ await removeDynamicVariables(ctx.params.queryId)
await db.remove(ctx.params.queryId, ctx.params.revId)
ctx.message = `Query deleted.`
ctx.status = 200
diff --git a/packages/server/src/api/controllers/role.js b/packages/server/src/api/controllers/role.js
index b79907031d..11b4b9a520 100644
--- a/packages/server/src/api/controllers/role.js
+++ b/packages/server/src/api/controllers/role.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../db")
const {
Role,
getRole,
@@ -10,6 +9,7 @@ const {
getUserMetadataParams,
InternalTables,
} = require("../../db/utils")
+const { getAppDB } = require("@budibase/backend-core/context")
const UpdateRolesOptions = {
CREATED: "created",
@@ -40,15 +40,15 @@ async function updateRolesOnUserTable(db, roleId, updateOption) {
}
exports.fetch = async function (ctx) {
- ctx.body = await getAllRoles(ctx.appId)
+ ctx.body = await getAllRoles()
}
exports.find = async function (ctx) {
- ctx.body = await getRole(ctx.appId, ctx.params.roleId)
+ ctx.body = await getRole(ctx.params.roleId)
}
exports.save = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
let { _id, name, inherits, permissionId } = ctx.request.body
if (!_id) {
_id = generateRoleID()
@@ -69,7 +69,7 @@ exports.save = async function (ctx) {
}
exports.destroy = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const roleId = ctx.params.roleId
if (isBuiltin(roleId)) {
ctx.throw(400, "Cannot delete builtin role.")
diff --git a/packages/server/src/api/controllers/routing.js b/packages/server/src/api/controllers/routing.js
index d45d33ed07..d6ba9d6ac2 100644
--- a/packages/server/src/api/controllers/routing.js
+++ b/packages/server/src/api/controllers/routing.js
@@ -39,12 +39,11 @@ Routing.prototype.addScreenId = function (fullpath, roleId, screenId) {
/**
* Gets the full routing structure by querying the routing view and processing the result into the tree.
- * @param {string} appId The application to produce the routing structure for.
* @returns {Promise} The routing structure, this is the full structure designed for use in the builder,
* if the client routing is required then the updateRoutingStructureForUserRole should be used.
*/
-async function getRoutingStructure(appId) {
- const screenRoutes = await getRoutingInfo(appId)
+async function getRoutingStructure() {
+ const screenRoutes = await getRoutingInfo()
const routing = new Routing()
for (let screenRoute of screenRoutes) {
@@ -57,13 +56,13 @@ async function getRoutingStructure(appId) {
}
exports.fetch = async ctx => {
- ctx.body = await getRoutingStructure(ctx.appId)
+ ctx.body = await getRoutingStructure()
}
exports.clientFetch = async ctx => {
- const routing = await getRoutingStructure(ctx.appId)
+ const routing = await getRoutingStructure()
let roleId = ctx.user.role._id
- const roleIds = await getUserRoleHierarchy(ctx.appId, roleId)
+ const roleIds = await getUserRoleHierarchy(roleId)
for (let topLevel of Object.values(routing.routes)) {
for (let subpathKey of Object.keys(topLevel.subpaths)) {
let found = false
diff --git a/packages/server/src/api/controllers/row/ExternalRequest.ts b/packages/server/src/api/controllers/row/ExternalRequest.ts
index 0bffd134c1..6aa51fb36b 100644
--- a/packages/server/src/api/controllers/row/ExternalRequest.ts
+++ b/packages/server/src/api/controllers/row/ExternalRequest.ts
@@ -19,6 +19,19 @@ import {
isRowId,
convertRowId,
} from "../../../integrations/utils"
+import { getDatasourceAndQuery } from "./utils"
+import {
+ DataSourceOperation,
+ FieldTypes,
+ RelationshipTypes,
+} from "../../../constants"
+import { breakExternalTableId, isSQL } from "../../../integrations/utils"
+import { processObjectSync } from "@budibase/string-templates"
+// @ts-ignore
+import { cloneDeep } from "lodash/fp"
+import { processFormulas } from "../../../utilities/rowProcessor/utils"
+// @ts-ignore
+import { getAppDB } from "@budibase/backend-core/context"
interface ManyRelationship {
tableId?: string
@@ -38,18 +51,6 @@ interface RunConfig {
}
module External {
- const { getDatasourceAndQuery } = require("./utils")
- const {
- DataSourceOperation,
- FieldTypes,
- RelationshipTypes,
- } = require("../../../constants")
- const { breakExternalTableId, isSQL } = require("../../../integrations/utils")
- const { processObjectSync } = require("@budibase/string-templates")
- const { cloneDeep } = require("lodash/fp")
- const CouchDB = require("../../../db")
- const { processFormulas } = require("../../../utilities/rowProcessor/utils")
-
function buildFilters(
id: string | undefined,
filters: SearchFilters,
@@ -183,7 +184,7 @@ module External {
thisRow._id = generateIdForRow(row, table)
thisRow.tableId = table._id
thisRow._rev = "rev"
- return thisRow
+ return processFormulas(table, thisRow)
}
function fixArrayTypes(row: Row, table: Table) {
@@ -210,19 +211,12 @@ module External {
}
class ExternalRequest {
- private readonly appId: string
private operation: Operation
private tableId: string
private datasource: Datasource
private tables: { [key: string]: Table } = {}
- constructor(
- appId: string,
- operation: Operation,
- tableId: string,
- datasource: Datasource
- ) {
- this.appId = appId
+ constructor(operation: Operation, tableId: string, datasource: Datasource) {
this.operation = operation
this.tableId = tableId
this.datasource = datasource
@@ -231,12 +225,14 @@ module External {
}
}
- getTable(tableId: string | undefined): Table {
+ getTable(tableId: string | undefined): Table | undefined {
if (!tableId) {
throw "Table ID is unknown, cannot find table"
}
const { tableName } = breakExternalTableId(tableId)
- return this.tables[tableName]
+ if (tableName) {
+ return this.tables[tableName]
+ }
}
inputProcessing(row: Row | undefined, table: Table) {
@@ -272,9 +268,11 @@ module External {
newRow[key] = row[key]
continue
}
- const { tableName: linkTableName } = breakExternalTableId(field.tableId)
+ const { tableName: linkTableName } = breakExternalTableId(
+ field?.tableId
+ )
// table has to exist for many to many
- if (!this.tables[linkTableName]) {
+ if (!linkTableName || !this.tables[linkTableName]) {
continue
}
const linkTable = this.tables[linkTableName]
@@ -329,8 +327,12 @@ module External {
* This iterates through the returned rows and works out what elements of the rows
* actually match up to another row (based on primary keys) - this is pretty specific
* to SQL and the way that SQL relationships are returned based on joins.
+ * This is complicated, but the idea is that when a SQL query returns, all of the relations
+ * will be in separate rows, with all of the data in each row. We have to decipher what comes
+ * from where (which tables) and how to convert that into budibase columns.
*/
updateRelationshipColumns(
+ table: Table,
row: Row,
rows: { [key: string]: Row },
relationships: RelationshipsJson[]
@@ -341,6 +343,13 @@ module External {
if (!linkedTable) {
continue
}
+ const fromColumn = `${table.name}.${relationship.from}`
+ const toColumn = `${linkedTable.name}.${relationship.to}`
+ // this is important when working with multiple relationships
+ // between the same tables - we don't want to overlap/multiply the relations
+ if (!relationship.through && row[fromColumn] !== row[toColumn]) {
+ continue
+ }
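The non-`through` guard added above is the crux of deciphering flattened SQL join rows: a row only contributes to a given relationship when its from/to key columns actually line up. Read in isolation, it amounts to something like the following sketch (`rowMatchesRelationship` is an invented name; the real check lives inline in `updateRelationshipColumns`):

```javascript
// sketch: does this flattened SQL join row belong to the given
// relationship? Direct relationships require the from/to key columns
// to match; "through" (junction table) relations are always kept
function rowMatchesRelationship(row, table, linkedTable, relationship) {
  if (relationship.through) {
    return true
  }
  // joined columns come back namespaced as "<table>.<column>"
  const fromColumn = `${table.name}.${relationship.from}`
  const toColumn = `${linkedTable.name}.${relationship.to}`
  return row[fromColumn] === row[toColumn]
}
```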
let linked = basicProcessing(row, linkedTable)
if (!linked._id) {
continue
@@ -388,6 +397,7 @@ module External {
// this is a relationship of some sort
if (finalRows[rowId]) {
finalRows = this.updateRelationshipColumns(
+ table,
row,
finalRows,
relationships
@@ -401,6 +411,7 @@ module External {
finalRows[thisRow._id] = thisRow
// do this at the end, once it's been added to the final rows
finalRows = this.updateRelationshipColumns(
+ table,
row,
finalRows,
relationships
@@ -422,7 +433,7 @@ module External {
}
const { tableName: linkTableName } = breakExternalTableId(field.tableId)
// no table to link to, this is not a valid relationship
- if (!this.tables[linkTableName]) {
+ if (!linkTableName || !this.tables[linkTableName]) {
continue
}
const linkTable = this.tables[linkTableName]
@@ -460,6 +471,9 @@ module External {
async lookupRelations(tableId: string, row: Row) {
const related: { [key: string]: any } = {}
const { tableName } = breakExternalTableId(tableId)
+ if (!tableName) {
+ return related
+ }
const table = this.tables[tableName]
// @ts-ignore
const primaryKey = table.primary[0]
@@ -484,7 +498,7 @@ module External {
if (!lookupField || !row[lookupField]) {
continue
}
- const response = await getDatasourceAndQuery(this.appId, {
+ const response = await getDatasourceAndQuery({
endpoint: getEndpoint(tableId, DataSourceOperation.READ),
filters: {
equal: {
@@ -515,28 +529,30 @@ module External {
row: Row,
relationships: ManyRelationship[]
) {
- const { appId } = this
// if we're creating (in a through table) we need to wipe the existing ones first
const promises = []
const related = await this.lookupRelations(mainTableId, row)
for (let relationship of relationships) {
const { key, tableId, isUpdate, id, ...rest } = relationship
- const body = processObjectSync(rest, row)
+ const body: { [key: string]: any } = processObjectSync(rest, row, {})
const linkTable = this.getTable(tableId)
// @ts-ignore
- const linkPrimary = linkTable.primary[0]
- const rows = related[key]?.rows || []
+ const linkPrimary = linkTable?.primary[0]
+ if (!linkTable || !linkPrimary) {
+ return
+ }
+ const rows = related[key].rows || []
const found = rows.find(
(row: { [key: string]: any }) =>
row[linkPrimary] === relationship.id ||
- row[linkPrimary] === body[linkPrimary]
+ row[linkPrimary] === body?.[linkPrimary]
)
const operation = isUpdate
? DataSourceOperation.UPDATE
: DataSourceOperation.CREATE
if (!found) {
promises.push(
- getDatasourceAndQuery(appId, {
+ getDatasourceAndQuery({
endpoint: getEndpoint(tableId, operation),
// if we're doing many relationships then we're writing, only one response
body,
@@ -552,9 +568,12 @@ module External {
for (let [colName, { isMany, rows, tableId }] of Object.entries(
related
)) {
- const table: Table = this.getTable(tableId)
+ const table: Table | undefined = this.getTable(tableId)
// if it's not the foreign key, skip it - nothing to do
- if (table.primary && table.primary.indexOf(colName) !== -1) {
+ if (
+ !table ||
+ (table.primary && table.primary.indexOf(colName) !== -1)
+ ) {
continue
}
for (let row of rows) {
@@ -566,7 +585,7 @@ module External {
: DataSourceOperation.UPDATE
const body = isMany ? null : { [colName]: null }
promises.push(
- getDatasourceAndQuery(this.appId, {
+ getDatasourceAndQuery({
endpoint: getEndpoint(tableId, op),
body,
filters,
@@ -605,20 +624,25 @@ module External {
continue
}
const { tableName: linkTableName } = breakExternalTableId(field.tableId)
- const linkTable = this.tables[linkTableName]
- if (linkTable) {
- const linkedFields = extractRealFields(linkTable, fields)
- fields = fields.concat(linkedFields)
+ if (linkTableName) {
+ const linkTable = this.tables[linkTableName]
+ if (linkTable) {
+ const linkedFields = extractRealFields(linkTable, fields)
+ fields = fields.concat(linkedFields)
+ }
}
}
return fields
}
async run(config: RunConfig) {
- const { appId, operation, tableId } = this
+ const { operation, tableId } = this
let { datasourceId, tableName } = breakExternalTableId(tableId)
+ if (!tableName) {
+ throw "Unable to run without a table name"
+ }
if (!this.datasource) {
- const db = new CouchDB(appId)
+ const db = getAppDB()
this.datasource = await db.get(datasourceId)
if (!this.datasource || !this.datasource.entities) {
throw "No tables found, fetch tables before query."
@@ -670,7 +694,7 @@ module External {
},
}
// can't really use response right now
- const response = await getDatasourceAndQuery(appId, json)
+ const response = await getDatasourceAndQuery(json)
// handle many to many relationships now if we know the ID (could be auto increment)
if (
operation !== DataSourceOperation.READ &&
diff --git a/packages/server/src/api/controllers/row/external.js b/packages/server/src/api/controllers/row/external.js
index b8620f7bc3..66a1e30ca6 100644
--- a/packages/server/src/api/controllers/row/external.js
+++ b/packages/server/src/api/controllers/row/external.js
@@ -9,9 +9,9 @@ const {
breakRowIdField,
} = require("../../../integrations/utils")
const ExternalRequest = require("./ExternalRequest")
-const CouchDB = require("../../../db")
+const { getAppDB } = require("@budibase/backend-core/context")
-async function handleRequest(appId, operation, tableId, opts = {}) {
+async function handleRequest(operation, tableId, opts = {}) {
// make sure the filters are cleaned up, no empty strings for equals, fuzzy or string
if (opts && opts.filters) {
for (let filterField of NoEmptyFilterStrings) {
@@ -25,31 +25,27 @@ async function handleRequest(appId, operation, tableId, opts = {}) {
}
}
}
- return new ExternalRequest(appId, operation, tableId, opts.datasource).run(
- opts
- )
+ return new ExternalRequest(operation, tableId, opts.datasource).run(opts)
}
exports.handleRequest = handleRequest
exports.patch = async ctx => {
- const appId = ctx.appId
const inputs = ctx.request.body
const tableId = ctx.params.tableId
const id = breakRowIdField(inputs._id)
// don't save the ID to db
delete inputs._id
- return handleRequest(appId, DataSourceOperation.UPDATE, tableId, {
+ return handleRequest(DataSourceOperation.UPDATE, tableId, {
id,
row: inputs,
})
}
exports.save = async ctx => {
- const appId = ctx.appId
const inputs = ctx.request.body
const tableId = ctx.params.tableId
- return handleRequest(appId, DataSourceOperation.CREATE, tableId, {
+ return handleRequest(DataSourceOperation.CREATE, tableId, {
row: inputs,
})
}
@@ -63,49 +59,35 @@ exports.fetchView = async ctx => {
}
exports.fetch = async ctx => {
- const appId = ctx.appId
const tableId = ctx.params.tableId
- return handleRequest(appId, DataSourceOperation.READ, tableId)
+ return handleRequest(DataSourceOperation.READ, tableId)
}
exports.find = async ctx => {
- const appId = ctx.appId
const id = ctx.params.rowId
const tableId = ctx.params.tableId
- const response = await handleRequest(
- appId,
- DataSourceOperation.READ,
- tableId,
- {
- id,
- }
- )
+ const response = await handleRequest(DataSourceOperation.READ, tableId, {
+ id,
+ })
return response ? response[0] : response
}
exports.destroy = async ctx => {
- const appId = ctx.appId
const tableId = ctx.params.tableId
const id = ctx.request.body._id
- const { row } = await handleRequest(
- appId,
- DataSourceOperation.DELETE,
- tableId,
- {
- id,
- }
- )
+ const { row } = await handleRequest(DataSourceOperation.DELETE, tableId, {
+ id,
+ })
return { response: { ok: true }, row }
}
exports.bulkDestroy = async ctx => {
- const appId = ctx.appId
const { rows } = ctx.request.body
const tableId = ctx.params.tableId
let promises = []
for (let row of rows) {
promises.push(
- handleRequest(appId, DataSourceOperation.DELETE, tableId, {
+ handleRequest(DataSourceOperation.DELETE, tableId, {
id: breakRowIdField(row._id),
})
)
@@ -115,7 +97,6 @@ exports.bulkDestroy = async ctx => {
}
exports.search = async ctx => {
- const appId = ctx.appId
const tableId = ctx.params.tableId
const { paginate, query, ...params } = ctx.request.body
let { bookmark, limit } = params
@@ -145,26 +126,21 @@ exports.search = async ctx => {
[params.sort]: direction,
}
}
- const rows = await handleRequest(appId, DataSourceOperation.READ, tableId, {
+ const rows = await handleRequest(DataSourceOperation.READ, tableId, {
filters: query,
sort,
paginate: paginateObj,
})
let hasNextPage = false
if (paginate && rows.length === limit) {
- const nextRows = await handleRequest(
- appId,
- DataSourceOperation.READ,
- tableId,
- {
- filters: query,
- sort,
- paginate: {
- limit: 1,
- page: bookmark * limit + 1,
- },
- }
- )
+ const nextRows = await handleRequest(DataSourceOperation.READ, tableId, {
+ filters: query,
+ sort,
+ paginate: {
+ limit: 1,
+ page: bookmark * limit + 1,
+ },
+ })
hasNextPage = nextRows.length > 0
}
// need wrapper object for bookmarks etc when paginating
@@ -177,25 +153,19 @@ exports.validate = async () => {
}
exports.fetchEnrichedRow = async ctx => {
- const appId = ctx.appId
const id = ctx.params.rowId
const tableId = ctx.params.tableId
const { datasourceId, tableName } = breakExternalTableId(tableId)
- const db = new CouchDB(appId)
+ const db = getAppDB()
const datasource = await db.get(datasourceId)
if (!datasource || !datasource.entities) {
ctx.throw(400, "Datasource has not been configured for plus API.")
}
const tables = datasource.entities
- const response = await handleRequest(
- appId,
- DataSourceOperation.READ,
- tableId,
- {
- id,
- datasource,
- }
- )
+ const response = await handleRequest(DataSourceOperation.READ, tableId, {
+ id,
+ datasource,
+ })
const table = tables[tableName]
const row = response[0]
// this seems like a lot of work, but basically we need to dig deeper for the enrich
@@ -214,7 +184,6 @@ exports.fetchEnrichedRow = async ctx => {
// don't support composite keys right now
const linkedIds = links.map(link => breakRowIdField(link._id)[0])
row[fieldName] = await handleRequest(
- appId,
DataSourceOperation.READ,
linkedTableId,
{
diff --git a/packages/server/src/api/controllers/row/internal.js b/packages/server/src/api/controllers/row/internal.js
index 0e9c2e651d..e1ea32e557 100644
--- a/packages/server/src/api/controllers/row/internal.js
+++ b/packages/server/src/api/controllers/row/internal.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../../db")
const linkRows = require("../../../db/linkedRows")
const {
generateRowID,
@@ -25,6 +24,7 @@ const {
getFromMemoryDoc,
} = require("../view/utils")
const { cloneDeep } = require("lodash/fp")
+const { getAppDB } = require("@budibase/backend-core/context")
const { finaliseRow, updateRelatedFormula } = require("./staticFormula")
const CALCULATION_TYPES = {
@@ -76,8 +76,7 @@ async function getRawTableData(ctx, db, tableId) {
}
exports.patch = async ctx => {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const inputs = ctx.request.body
const tableId = inputs.tableId
const isUserTable = tableId === InternalTables.USER_METADATA
@@ -116,14 +115,13 @@ exports.patch = async ctx => {
// returned row is cleaned and prepared for writing to DB
row = await linkRows.updateLinks({
- appId,
eventType: linkRows.EventType.ROW_UPDATE,
row,
tableId: row.tableId,
table,
})
// check if any attachments removed
- await cleanupAttachments(appId, table, { oldRow, row })
+ await cleanupAttachments(table, { oldRow, row })
if (isUserTable) {
// the row has been updated, need to put it into the ctx
@@ -132,15 +130,14 @@ exports.patch = async ctx => {
return { row: ctx.body, table }
}
- return finaliseRow(ctx.appId, table, row, {
+ return finaliseRow(table, row, {
oldTable: dbTable,
updateFormula: true,
})
}
exports.save = async function (ctx) {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
let inputs = ctx.request.body
inputs.tableId = ctx.params.tableId
@@ -162,21 +159,19 @@ exports.save = async function (ctx) {
// make sure link rows are up to date
row = await linkRows.updateLinks({
- appId,
eventType: linkRows.EventType.ROW_SAVE,
row,
tableId: row.tableId,
table,
})
- return finaliseRow(ctx.appId, table, row, {
+ return finaliseRow(table, row, {
oldTable: dbTable,
updateFormula: true,
})
}
exports.fetchView = async ctx => {
- const appId = ctx.appId
const viewName = ctx.params.viewName
// if this is a table view being looked for just transfer to that
@@ -185,7 +180,7 @@ exports.fetchView = async ctx => {
return exports.fetch(ctx)
}
- const db = new CouchDB(appId)
+ const db = getAppDB()
const { calculation, group, field } = ctx.query
const viewInfo = await getView(db, viewName)
let response
@@ -212,7 +207,7 @@ exports.fetchView = async ctx => {
schema: {},
}
}
- rows = await outputProcessing(ctx, table, response.rows)
+ rows = await outputProcessing(table, response.rows)
}
if (calculation === CALCULATION_TYPES.STATS) {
@@ -239,27 +234,24 @@ exports.fetchView = async ctx => {
}
exports.fetch = async ctx => {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const tableId = ctx.params.tableId
let table = await db.get(tableId)
let rows = await getRawTableData(ctx, db, tableId)
- return outputProcessing(ctx, table, rows)
+ return outputProcessing(table, rows)
}
exports.find = async ctx => {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const table = await db.get(ctx.params.tableId)
- let row = await findRow(ctx, db, ctx.params.tableId, ctx.params.rowId)
- row = await outputProcessing(ctx, table, row)
+ let row = await findRow(ctx, ctx.params.tableId, ctx.params.rowId)
+ row = await outputProcessing(table, row)
return row
}
exports.destroy = async function (ctx) {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const { _id, _rev } = ctx.request.body
let row = await db.get(_id)
@@ -268,18 +260,17 @@ exports.destroy = async function (ctx) {
}
const table = await db.get(row.tableId)
// update the row to include full relationships before deleting them
- row = await outputProcessing(ctx, table, row, { squash: false })
+ row = await outputProcessing(table, row, { squash: false })
// now remove the relationships
await linkRows.updateLinks({
- appId,
eventType: linkRows.EventType.ROW_DELETE,
row,
tableId: row.tableId,
})
// remove any attachments that were on the row from object storage
- await cleanupAttachments(appId, table, { row })
+ await cleanupAttachments(table, { row })
// remove any static formula
- await updateRelatedFormula(appId, table, row)
+ await updateRelatedFormula(table, row)
let response
if (ctx.params.tableId === InternalTables.USER_METADATA) {
@@ -295,20 +286,18 @@ exports.destroy = async function (ctx) {
}
exports.bulkDestroy = async ctx => {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const tableId = ctx.params.tableId
const table = await db.get(tableId)
let { rows } = ctx.request.body
// before carrying out any updates, make sure the rows are ready to be returned
// they need to be the full rows (including previous relationships) for automations
- rows = await outputProcessing(ctx, table, rows, { squash: false })
+ rows = await outputProcessing(table, rows, { squash: false })
// remove the relationships first
let updates = rows.map(row =>
linkRows.updateLinks({
- appId,
eventType: linkRows.EventType.ROW_DELETE,
row,
tableId: row.tableId,
@@ -327,8 +316,8 @@ exports.bulkDestroy = async ctx => {
await db.bulkDocs(rows.map(row => ({ ...row, _deleted: true })))
}
// remove any attachments that were on the rows from object storage
- await cleanupAttachments(appId, table, { rows })
- await updateRelatedFormula(appId, table, rows)
+ await cleanupAttachments(table, { rows })
+ await updateRelatedFormula(table, rows)
await Promise.all(updates)
return { response: { ok: true }, rows }
}
@@ -339,28 +328,27 @@ exports.search = async ctx => {
return { rows: await exports.fetch(ctx) }
}
- const appId = ctx.appId
const { tableId } = ctx.params
- const db = new CouchDB(appId)
+ const db = getAppDB()
const { paginate, query, ...params } = ctx.request.body
params.version = ctx.version
params.tableId = tableId
let response
if (paginate) {
- response = await paginatedSearch(appId, query, params)
+ response = await paginatedSearch(query, params)
} else {
- response = await fullSearch(appId, query, params)
+ response = await fullSearch(query, params)
}
// Enrich search results with relationships
if (response.rows && response.rows.length) {
// enrich with global users if from users table
if (tableId === InternalTables.USER_METADATA) {
- response.rows = await getGlobalUsersFromMetadata(appId, response.rows)
+ response.rows = await getGlobalUsersFromMetadata(response.rows)
}
const table = await db.get(tableId)
- response.rows = await outputProcessing(ctx, table, response.rows)
+ response.rows = await outputProcessing(table, response.rows)
}
return response
@@ -368,25 +356,22 @@ exports.search = async ctx => {
exports.validate = async ctx => {
return validate({
- appId: ctx.appId,
tableId: ctx.params.tableId,
row: ctx.request.body,
})
}
exports.fetchEnrichedRow = async ctx => {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const tableId = ctx.params.tableId
const rowId = ctx.params.rowId
// need table to work out where links go in row
let [table, row] = await Promise.all([
db.get(tableId),
- findRow(ctx, db, tableId, rowId),
+ findRow(ctx, tableId, rowId),
])
// get the link docs
const linkVals = await linkRows.getLinkDocuments({
- appId,
tableId,
rowId,
})
@@ -413,7 +398,7 @@ exports.fetchEnrichedRow = async ctx => {
for (let [tableId, rows] of Object.entries(groups)) {
// need to include the IDs in these rows for any links they may have
linkedRows = linkedRows.concat(
- await outputProcessing(ctx, tables[tableId], rows)
+ await outputProcessing(tables[tableId], rows)
)
}
diff --git a/packages/server/src/api/controllers/row/internalSearch.js b/packages/server/src/api/controllers/row/internalSearch.js
index a185386b7a..611b3272f3 100644
--- a/packages/server/src/api/controllers/row/internalSearch.js
+++ b/packages/server/src/api/controllers/row/internalSearch.js
@@ -1,14 +1,14 @@
const { SearchIndexes } = require("../../../db/utils")
const fetch = require("node-fetch")
const { getCouchUrl } = require("@budibase/backend-core/db")
+const { getAppId } = require("@budibase/backend-core/context")
/**
* Class to build lucene query URLs.
* Optionally takes a base lucene query object.
*/
class QueryBuilder {
- constructor(appId, base) {
- this.appId = appId
+ constructor(base) {
this.query = {
string: {},
fuzzy: {},
@@ -241,7 +241,8 @@ class QueryBuilder {
}
async run() {
- const url = `${getCouchUrl()}/${this.appId}/_design/database/_search/${
+ const appId = getAppId()
+ const url = `${getCouchUrl()}/${appId}/_design/database/_search/${
SearchIndexes.ROWS
}`
const body = this.buildSearchBody()
@@ -278,7 +279,6 @@ const runQuery = async (url, body) => {
* Gets round the fixed limit of 200 results from a query by fetching as many
 * pages as required and concatenating the results. This operates recursively
* until enough results have been found.
- * @param appId {string} The app ID to search
* @param query {object} The JSON query structure
* @param params {object} The search params including:
* tableId {string} The table ID to search
@@ -291,7 +291,7 @@ const runQuery = async (url, body) => {
* rows {array|null} Current results in the recursive search
* @returns {Promise<*[]|*>}
*/
-const recursiveSearch = async (appId, query, params) => {
+const recursiveSearch = async (query, params) => {
const bookmark = params.bookmark
const rows = params.rows || []
if (rows.length >= params.limit) {
@@ -301,7 +301,7 @@ const recursiveSearch = async (appId, query, params) => {
if (rows.length > params.limit - 200) {
pageSize = params.limit - rows.length
}
- const page = await new QueryBuilder(appId, query)
+ const page = await new QueryBuilder(query)
.setVersion(params.version)
.setTable(params.tableId)
.setBookmark(bookmark)
@@ -321,14 +321,13 @@ const recursiveSearch = async (appId, query, params) => {
bookmark: page.bookmark,
rows: [...rows, ...page.rows],
}
- return await recursiveSearch(appId, query, newParams)
+ return await recursiveSearch(query, newParams)
}
/**
* Performs a paginated search. A bookmark will be returned to allow the next
 * page to be fetched. There is a max limit of 200 results per page in a
* paginated search.
- * @param appId {string} The app ID to search
* @param query {object} The JSON query structure
* @param params {object} The search params including:
* tableId {string} The table ID to search
@@ -340,13 +339,13 @@ const recursiveSearch = async (appId, query, params) => {
* bookmark {string} The bookmark to resume from
* @returns {Promise<{hasNextPage: boolean, rows: *[]}>}
*/
-exports.paginatedSearch = async (appId, query, params) => {
+exports.paginatedSearch = async (query, params) => {
let limit = params.limit
if (limit == null || isNaN(limit) || limit < 0) {
limit = 50
}
limit = Math.min(limit, 200)
- const search = new QueryBuilder(appId, query)
+ const search = new QueryBuilder(query)
.setVersion(params.version)
.setTable(params.tableId)
.setSort(params.sort)
@@ -375,7 +374,6 @@ exports.paginatedSearch = async (appId, query, params) => {
 * desired number of results. There is a limit of 1000 results to avoid
* heavy performance hits, and to avoid client components breaking from
* handling too much data.
- * @param appId {string} The app ID to search
* @param query {object} The JSON query structure
* @param params {object} The search params including:
* tableId {string} The table ID to search
@@ -386,12 +384,12 @@ exports.paginatedSearch = async (appId, query, params) => {
* limit {number} The desired number of results
* @returns {Promise<{rows: *}>}
*/
-exports.fullSearch = async (appId, query, params) => {
+exports.fullSearch = async (query, params) => {
let limit = params.limit
if (limit == null || isNaN(limit) || limit < 0) {
limit = 1000
}
params.limit = Math.min(limit, 1000)
- const rows = await recursiveSearch(appId, query, params)
+ const rows = await recursiveSearch(query, params)
return { rows }
}
diff --git a/packages/server/src/api/controllers/row/staticFormula.js b/packages/server/src/api/controllers/row/staticFormula.js
index fc0edd1cb4..bc62c08198 100644
--- a/packages/server/src/api/controllers/row/staticFormula.js
+++ b/packages/server/src/api/controllers/row/staticFormula.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../../db")
const { getRowParams } = require("../../../db/utils")
const {
outputProcessing,
@@ -8,6 +7,7 @@ const {
const { FieldTypes, FormulaTypes } = require("../../../constants")
const { isEqual } = require("lodash")
const { cloneDeep } = require("lodash/fp")
+const { getAppDB } = require("@budibase/backend-core/context")
/**
* This function runs through a list of enriched rows, looks at the rows which
@@ -15,8 +15,8 @@ const { cloneDeep } = require("lodash/fp")
* updated.
 * NOTE: this will only affect static formulas.
*/
-exports.updateRelatedFormula = async (appId, table, enrichedRows) => {
- const db = new CouchDB(appId)
+exports.updateRelatedFormula = async (table, enrichedRows) => {
+ const db = getAppDB()
// no formula to update, we're done
if (!table.relatedFormula) {
return
@@ -57,7 +57,7 @@ exports.updateRelatedFormula = async (appId, table, enrichedRows) => {
// re-enrich rows for all the related, don't update the related formula for them
promises = promises.concat(
relatedRows[tableId].map(related =>
- exports.finaliseRow(appId, relatedTable, related, {
+ exports.finaliseRow(relatedTable, related, {
updateFormula: false,
})
)
@@ -69,8 +69,8 @@ exports.updateRelatedFormula = async (appId, table, enrichedRows) => {
await Promise.all(promises)
}
-exports.updateAllFormulasInTable = async (appId, table) => {
- const db = new CouchDB(appId)
+exports.updateAllFormulasInTable = async table => {
+ const db = getAppDB()
// start by getting the raw rows (which will be written back to DB after update)
let rows = (
await db.allDocs(
@@ -81,7 +81,7 @@ exports.updateAllFormulasInTable = async (appId, table) => {
).rows.map(row => row.doc)
// now enrich the rows, note the clone so that we have the base state of the
// rows so that we don't write any of the enriched information back
- let enrichedRows = await outputProcessing({ appId }, table, cloneDeep(rows), {
+ let enrichedRows = await outputProcessing(table, cloneDeep(rows), {
squash: false,
})
const updatedRows = []
@@ -109,15 +109,14 @@ exports.updateAllFormulasInTable = async (appId, table) => {
* expects the row to be totally enriched/contain all relationships.
*/
exports.finaliseRow = async (
- appId,
table,
row,
{ oldTable, updateFormula } = { updateFormula: true }
) => {
- const db = new CouchDB(appId)
+ const db = getAppDB()
row.type = "row"
// process the row before return, to include relationships
- let enrichedRow = await outputProcessing({ appId }, table, cloneDeep(row), {
+ let enrichedRow = await outputProcessing(table, cloneDeep(row), {
squash: false,
})
// use enriched row to generate formulas for saving, specifically only use as context
@@ -151,7 +150,7 @@ exports.finaliseRow = async (
enrichedRow = await processFormulas(table, enrichedRow, { dynamic: false })
// this updates the related formulas in other rows based on the relations to this row
if (updateFormula) {
- await exports.updateRelatedFormula(appId, table, enrichedRow)
+ await exports.updateRelatedFormula(table, enrichedRow)
}
return { row: enrichedRow, table }
}
diff --git a/packages/server/src/api/controllers/row/utils.js b/packages/server/src/api/controllers/row/utils.js
index 51bc03eba4..4235e70127 100644
--- a/packages/server/src/api/controllers/row/utils.js
+++ b/packages/server/src/api/controllers/row/utils.js
@@ -1,11 +1,11 @@
const validateJs = require("validate.js")
const { cloneDeep } = require("lodash/fp")
-const CouchDB = require("../../../db")
const { InternalTables } = require("../../../db/utils")
const userController = require("../user")
const { FieldTypes } = require("../../../constants")
const { processStringSync } = require("@budibase/string-templates")
const { makeExternalQuery } = require("../../../integrations/base/utils")
+const { getAppDB } = require("@budibase/backend-core/context")
validateJs.extend(validateJs.validators.datetime, {
parse: function (value) {
@@ -17,14 +17,15 @@ validateJs.extend(validateJs.validators.datetime, {
},
})
-exports.getDatasourceAndQuery = async (appId, json) => {
+exports.getDatasourceAndQuery = async json => {
const datasourceId = json.endpoint.datasourceId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const datasource = await db.get(datasourceId)
return makeExternalQuery(datasource, json)
}
-exports.findRow = async (ctx, db, tableId, rowId) => {
+exports.findRow = async (ctx, tableId, rowId) => {
+ const db = getAppDB()
let row
// TODO remove special user case in future
if (tableId === InternalTables.USER_METADATA) {
@@ -42,9 +43,9 @@ exports.findRow = async (ctx, db, tableId, rowId) => {
return row
}
-exports.validate = async ({ appId, tableId, row, table }) => {
+exports.validate = async ({ tableId, row, table }) => {
if (!table) {
- const db = new CouchDB(appId)
+ const db = getAppDB()
table = await db.get(tableId)
}
const errors = {}
diff --git a/packages/server/src/api/controllers/screen.js b/packages/server/src/api/controllers/screen.js
index 5e0eeb5176..e166ab3eb8 100644
--- a/packages/server/src/api/controllers/screen.js
+++ b/packages/server/src/api/controllers/screen.js
@@ -1,10 +1,9 @@
-const CouchDB = require("../../db")
const { getScreenParams, generateScreenID } = require("../../db/utils")
const { AccessController } = require("@budibase/backend-core/roles")
+const { getAppDB } = require("@budibase/backend-core/context")
exports.fetch = async ctx => {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const screens = (
await db.allDocs(
@@ -14,15 +13,14 @@ exports.fetch = async ctx => {
)
).rows.map(element => element.doc)
- ctx.body = await new AccessController(appId).checkScreensAccess(
+ ctx.body = await new AccessController().checkScreensAccess(
screens,
ctx.user.role._id
)
}
exports.save = async ctx => {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
let screen = ctx.request.body
if (!screen._id) {
@@ -39,7 +37,7 @@ exports.save = async ctx => {
}
exports.destroy = async ctx => {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
await db.remove(ctx.params.screenId, ctx.params.screenRev)
ctx.body = {
message: "Screen deleted successfully",
diff --git a/packages/server/src/api/controllers/static/index.js b/packages/server/src/api/controllers/static/index.js
index 11bb14e282..82e66ab545 100644
--- a/packages/server/src/api/controllers/static/index.js
+++ b/packages/server/src/api/controllers/static/index.js
@@ -6,7 +6,6 @@ const uuid = require("uuid")
const { ObjectStoreBuckets } = require("../../../constants")
const { processString } = require("@budibase/string-templates")
const { getAllApps } = require("@budibase/backend-core/db")
-const CouchDB = require("../../../db")
const {
loadHandlebarsFile,
NODE_MODULES_PATH,
@@ -17,6 +16,7 @@ const { clientLibraryPath } = require("../../../utilities")
const { upload } = require("../../../utilities/fileSystem")
const { attachmentsRelativeURL } = require("../../../utilities")
const { DocumentTypes } = require("../../../db/utils")
+const { getAppDB, updateAppId } = require("@budibase/backend-core/context")
const AWS = require("aws-sdk")
const AWS_REGION = env.AWS_REGION ? env.AWS_REGION : "eu-west-1"
@@ -44,16 +44,14 @@ async function getAppIdFromUrl(ctx) {
let possibleAppUrl = `/${encodeURI(ctx.params.appId).toLowerCase()}`
// search prod apps for a url that matches, exclude dev where id is always used
- const apps = await getAllApps(CouchDB, { dev: false })
+ const apps = await getAllApps({ dev: false })
const app = apps.filter(
a => a.url && a.url.toLowerCase() === possibleAppUrl
)[0]
- if (app && app.appId) {
- return app.appId
- } else {
- return ctx.params.appId
- }
+ const appId = app && app.appId ? app.appId : ctx.params.appId
+ updateAppId(appId)
+ return appId
}
exports.serveBuilder = async function (ctx) {
@@ -85,7 +83,7 @@ exports.uploadFile = async function (ctx) {
exports.serveApp = async function (ctx) {
let appId = await getAppIdFromUrl(ctx)
const App = require("./templates/BudibaseApp.svelte").default
- const db = new CouchDB(appId, { skip_setup: true })
+ const db = getAppDB({ skip_setup: true })
const appInfo = await db.get(DocumentTypes.APP_METADATA)
const { head, html, css } = App.render({
@@ -111,7 +109,7 @@ exports.serveClientLibrary = async function (ctx) {
}
exports.getSignedUploadURL = async function (ctx) {
- const database = new CouchDB(ctx.appId)
+ const database = getAppDB()
// Ensure datasource is valid
let datasource
diff --git a/packages/server/src/api/controllers/table/bulkFormula.js b/packages/server/src/api/controllers/table/bulkFormula.js
index 1866d8e650..27f62415c9 100644
--- a/packages/server/src/api/controllers/table/bulkFormula.js
+++ b/packages/server/src/api/controllers/table/bulkFormula.js
@@ -1,10 +1,10 @@
-const CouchDB = require("../../../db")
const { FieldTypes, FormulaTypes } = require("../../../constants")
const { getAllInternalTables, clearColumns } = require("./utils")
const { doesContainStrings } = require("@budibase/string-templates")
const { cloneDeep } = require("lodash/fp")
const { isEqual, uniq } = require("lodash")
const { updateAllFormulasInTable } = require("../row/staticFormula")
+const { getAppDB } = require("@budibase/backend-core/context")
function isStaticFormula(column) {
return (
@@ -37,14 +37,9 @@ function getFormulaThatUseColumn(table, columnNames) {
 * This function checks, when a related table, column or related column is deleted, whether any
 * tables need to have the formula column removed.
*/
-async function checkIfFormulaNeedsCleared(
- appId,
- table,
- { oldTable, deletion }
-) {
- const db = new CouchDB(appId)
+async function checkIfFormulaNeedsCleared(table, { oldTable, deletion }) {
// start by retrieving all tables, remove the current table from the list
- const tables = (await getAllInternalTables(appId)).filter(
+ const tables = (await getAllInternalTables()).filter(
tbl => tbl._id !== table._id
)
const schemaToUse = oldTable ? oldTable.schema : table.schema
@@ -60,7 +55,7 @@ async function checkIfFormulaNeedsCleared(
}
const columnsToDelete = getFormulaThatUseColumn(tableToUse, removed.name)
if (columnsToDelete.length > 0) {
- await clearColumns(db, table, columnsToDelete)
+ await clearColumns(table, columnsToDelete)
}
// need a special case, where a column has been removed from this table, but was used
 // in a different, related table's formula
@@ -85,7 +80,7 @@ async function checkIfFormulaNeedsCleared(
)
}
if (relatedFormulaToRemove.length > 0) {
- await clearColumns(db, relatedTable, uniq(relatedFormulaToRemove))
+ await clearColumns(relatedTable, uniq(relatedFormulaToRemove))
}
}
}
@@ -99,13 +94,12 @@ async function checkIfFormulaNeedsCleared(
* specifically only for static formula.
*/
async function updateRelatedFormulaLinksOnTables(
- appId,
table,
{ deletion } = { deletion: false }
) {
- const db = new CouchDB(appId)
+ const db = getAppDB()
// start by retrieving all tables, remove the current table from the list
- const tables = (await getAllInternalTables(appId)).filter(
+ const tables = (await getAllInternalTables()).filter(
tbl => tbl._id !== table._id
)
// clone the tables, so we can compare at end
@@ -155,7 +149,7 @@ async function updateRelatedFormulaLinksOnTables(
}
}
-async function checkIfFormulaUpdated(appId, table, { oldTable }) {
+async function checkIfFormulaUpdated(table, { oldTable }) {
// look to see if any formula values have changed
const shouldUpdate = Object.values(table.schema).find(
column =>
@@ -166,18 +160,14 @@ async function checkIfFormulaUpdated(appId, table, { oldTable }) {
)
// if a static formula column has updated, then need to run the update
if (shouldUpdate != null) {
- await updateAllFormulasInTable(appId, table)
+ await updateAllFormulasInTable(table)
}
}
-exports.runStaticFormulaChecks = async (
- appId,
- table,
- { oldTable, deletion }
-) => {
- await updateRelatedFormulaLinksOnTables(appId, table, { deletion })
- await checkIfFormulaNeedsCleared(appId, table, { oldTable, deletion })
+exports.runStaticFormulaChecks = async (table, { oldTable, deletion }) => {
+ await updateRelatedFormulaLinksOnTables(table, { deletion })
+ await checkIfFormulaNeedsCleared(table, { oldTable, deletion })
if (!deletion) {
- await checkIfFormulaUpdated(appId, table, { oldTable })
+ await checkIfFormulaUpdated(table, { oldTable })
}
}
diff --git a/packages/server/src/api/controllers/table/external.js b/packages/server/src/api/controllers/table/external.js
index 2453ca7a37..b27eebb0c4 100644
--- a/packages/server/src/api/controllers/table/external.js
+++ b/packages/server/src/api/controllers/table/external.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../../db")
const {
buildExternalTableId,
breakExternalTableId,
@@ -19,6 +18,7 @@ const { makeExternalQuery } = require("../../../integrations/base/utils")
const { cloneDeep } = require("lodash/fp")
const csvParser = require("../../../utilities/csvParser")
const { handleRequest } = require("../row/external")
+const { getAppDB } = require("@budibase/backend-core/context")
async function makeTableRequest(
datasource,
@@ -159,7 +159,6 @@ function isRelationshipSetup(column) {
}
exports.save = async function (ctx) {
- const appId = ctx.appId
const table = ctx.request.body
// can't do this right now
delete table.dataImport
@@ -176,14 +175,14 @@ exports.save = async function (ctx) {
let oldTable
if (ctx.request.body && ctx.request.body._id) {
- oldTable = await getTable(appId, ctx.request.body._id)
+ oldTable = await getTable(ctx.request.body._id)
}
if (hasTypeChanged(tableToSave, oldTable)) {
ctx.throw(400, "A column type has changed.")
}
- const db = new CouchDB(appId)
+ const db = getAppDB()
const datasource = await db.get(datasourceId)
const oldTables = cloneDeep(datasource.entities)
const tables = datasource.entities
@@ -267,14 +266,13 @@ exports.save = async function (ctx) {
}
exports.destroy = async function (ctx) {
- const appId = ctx.appId
- const tableToDelete = await getTable(appId, ctx.params.tableId)
+ const tableToDelete = await getTable(ctx.params.tableId)
if (!tableToDelete || !tableToDelete.created) {
ctx.throw(400, "Cannot delete tables which weren't created in Budibase.")
}
const datasourceId = getDatasourceId(tableToDelete)
- const db = new CouchDB(appId)
+ const db = getAppDB()
const datasource = await db.get(datasourceId)
const tables = datasource.entities
@@ -290,8 +288,7 @@ exports.destroy = async function (ctx) {
}
exports.bulkImport = async function (ctx) {
- const appId = ctx.appId
- const table = await getTable(appId, ctx.params.tableId)
+ const table = await getTable(ctx.params.tableId)
const { dataImport } = ctx.request.body
if (!dataImport || !dataImport.schema || !dataImport.csvString) {
ctx.throw(400, "Provided data import information is invalid.")
@@ -300,7 +297,7 @@ exports.bulkImport = async function (ctx) {
...dataImport,
existingTable: table,
})
- await handleRequest(appId, DataSourceOperation.BULK_CREATE, table._id, {
+ await handleRequest(DataSourceOperation.BULK_CREATE, table._id, {
rows,
})
return table
diff --git a/packages/server/src/api/controllers/table/index.js b/packages/server/src/api/controllers/table/index.js
index 2f6bfd0cb3..3e1845b91f 100644
--- a/packages/server/src/api/controllers/table/index.js
+++ b/packages/server/src/api/controllers/table/index.js
@@ -1,9 +1,9 @@
-const CouchDB = require("../../../db")
const internal = require("./internal")
const external = require("./external")
const csvParser = require("../../../utilities/csvParser")
const { isExternalTable, isSQL } = require("../../../integrations/utils")
const { getDatasourceParams } = require("../../../db/utils")
+const { getAppDB } = require("@budibase/backend-core/context")
const { getTable, getAllInternalTables } = require("./utils")
function pickApi({ tableId, table }) {
@@ -20,9 +20,9 @@ function pickApi({ tableId, table }) {
// covers both internal and external
exports.fetch = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
- const internal = await getAllInternalTables(ctx.appId)
+ const internal = await getAllInternalTables()
const externalTables = await db.allDocs(
getDatasourceParams("plus", {
@@ -49,7 +49,7 @@ exports.fetch = async function (ctx) {
exports.find = async function (ctx) {
const tableId = ctx.params.id
- ctx.body = await getTable(ctx.appId, tableId)
+ ctx.body = await getTable(tableId)
}
exports.save = async function (ctx) {
@@ -88,7 +88,7 @@ exports.validateCSVSchema = async function (ctx) {
const { csvString, schema = {}, tableId } = ctx.request.body
let existingTable
if (tableId) {
- existingTable = await getTable(ctx.appId, tableId)
+ existingTable = await getTable(tableId)
}
let result = await csvParser.parse(csvString, schema)
if (existingTable) {
diff --git a/packages/server/src/api/controllers/table/internal.js b/packages/server/src/api/controllers/table/internal.js
index f38a114c25..476e7a52af 100644
--- a/packages/server/src/api/controllers/table/internal.js
+++ b/packages/server/src/api/controllers/table/internal.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../../db")
const linkRows = require("../../../db/linkedRows")
const { getRowParams, generateTableID } = require("../../../db/utils")
const { FieldTypes } = require("../../../constants")
@@ -9,12 +8,13 @@ const {
handleDataImport,
} = require("./utils")
const usageQuota = require("../../../utilities/usageQuota")
+const { getAppDB } = require("@budibase/backend-core/context")
+const env = require("../../../environment")
const { cleanupAttachments } = require("../../../utilities/rowProcessor")
const { runStaticFormulaChecks } = require("./bulkFormula")
exports.save = async function (ctx) {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const { dataImport, ...rest } = ctx.request.body
let tableToSave = {
type: "table",
@@ -36,8 +36,7 @@ exports.save = async function (ctx) {
// saving a table is a complex operation, involving many different steps, this
// has been broken out into a utility to make it more obvious/easier to manipulate
const tableSaveFunctions = new TableSaveFunctions({
- db,
- ctx,
+ user: ctx.user,
oldTable,
dataImport,
})
@@ -82,7 +81,6 @@ exports.save = async function (ctx) {
// update linked rows
try {
const linkResp = await linkRows.updateLinks({
- appId,
eventType: oldTable
? linkRows.EventType.TABLE_UPDATED
: linkRows.EventType.TABLE_SAVE,
@@ -107,13 +105,12 @@ exports.save = async function (ctx) {
tableToSave = await tableSaveFunctions.after(tableToSave)
// has to run after, make sure it has _id
- await runStaticFormulaChecks(appId, tableToSave, { oldTable })
+ await runStaticFormulaChecks(tableToSave, { oldTable })
return tableToSave
}
exports.destroy = async function (ctx) {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const tableToDelete = await db.get(ctx.params.tableId)
// Delete all rows for that table
@@ -127,7 +124,6 @@ exports.destroy = async function (ctx) {
// update linked rows
await linkRows.updateLinks({
- appId,
eventType: linkRows.EventType.TABLE_DELETE,
table: tableToDelete,
})
@@ -136,24 +132,25 @@ exports.destroy = async function (ctx) {
await db.remove(tableToDelete)
// remove table search index
- const currentIndexes = await db.getIndexes()
- const existingIndex = currentIndexes.indexes.find(
- existing => existing.name === `search:${ctx.params.tableId}`
- )
- if (existingIndex) {
- await db.deleteIndex(existingIndex)
+ if (!env.isTest()) {
+ const currentIndexes = await db.getIndexes()
+ const existingIndex = currentIndexes.indexes.find(
+ existing => existing.name === `search:${ctx.params.tableId}`
+ )
+ if (existingIndex) {
+ await db.deleteIndex(existingIndex)
+ }
}
// has to run after, make sure it has _id
- await runStaticFormulaChecks(appId, tableToDelete, { deletion: true })
- await cleanupAttachments(appId, tableToDelete, { rows })
+ await runStaticFormulaChecks(tableToDelete, { deletion: true })
+ await cleanupAttachments(tableToDelete, { rows })
return tableToDelete
}
exports.bulkImport = async function (ctx) {
- const appId = ctx.appId
- const table = await getTable(appId, ctx.params.tableId)
+ const table = await getTable(ctx.params.tableId)
const { dataImport } = ctx.request.body
- await handleDataImport(appId, ctx.user, table, dataImport)
+ await handleDataImport(ctx.user, table, dataImport)
return table
}
diff --git a/packages/server/src/api/controllers/table/utils.js b/packages/server/src/api/controllers/table/utils.js
index f1907666c9..0e299dbd0d 100644
--- a/packages/server/src/api/controllers/table/utils.js
+++ b/packages/server/src/api/controllers/table/utils.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../../db")
const csvParser = require("../../../utilities/csvParser")
const {
getRowParams,
@@ -26,10 +25,11 @@ const {
const { getViews, saveView } = require("../view/utils")
const viewTemplate = require("../view/viewBuilder")
const usageQuota = require("../../../utilities/usageQuota")
+const { getAppDB } = require("@budibase/backend-core/context")
const { cloneDeep } = require("lodash/fp")
-exports.clearColumns = async (appId, table, columnNames) => {
- const db = new CouchDB(appId)
+exports.clearColumns = async (table, columnNames) => {
+ const db = getAppDB()
const rows = await db.allDocs(
getRowParams(table._id, null, {
include_docs: true,
@@ -43,7 +43,8 @@ exports.clearColumns = async (appId, table, columnNames) => {
)
}
-exports.checkForColumnUpdates = async (appId, db, oldTable, updatedTable) => {
+exports.checkForColumnUpdates = async (oldTable, updatedTable) => {
+ const db = getAppDB()
let updatedRows = []
const rename = updatedTable._rename
let deletedColumns = []
@@ -73,9 +74,9 @@ exports.checkForColumnUpdates = async (appId, db, oldTable, updatedTable) => {
})
// cleanup any attachments from object storage for deleted attachment columns
- await cleanupAttachments(appId, updatedTable, { oldTable, rows: rawRows })
+ await cleanupAttachments(updatedTable, { oldTable, rows: rawRows })
// Update views
- await exports.checkForViewUpdates(db, updatedTable, rename, deletedColumns)
+ await exports.checkForViewUpdates(updatedTable, rename, deletedColumns)
delete updatedTable._rename
}
return { rows: updatedRows, table: updatedTable }
@@ -102,12 +103,12 @@ exports.makeSureTableUpToDate = (table, tableToSave) => {
return tableToSave
}
-exports.handleDataImport = async (appId, user, table, dataImport) => {
+exports.handleDataImport = async (user, table, dataImport) => {
if (!dataImport || !dataImport.csvString) {
return table
}
- const db = new CouchDB(appId)
+ const db = getAppDB()
// Populate the table with rows imported from CSV in a bulk update
const data = await csvParser.transform({
...dataImport,
@@ -152,8 +153,8 @@ exports.handleDataImport = async (appId, user, table, dataImport) => {
return table
}
-exports.handleSearchIndexes = async (appId, table) => {
- const db = new CouchDB(appId)
+exports.handleSearchIndexes = async table => {
+ const db = getAppDB()
// create relevant search indexes
if (table.indexes && table.indexes.length > 0) {
const currentIndexes = await db.getIndexes()
@@ -210,12 +211,9 @@ exports.checkStaticTables = table => {
}
class TableSaveFunctions {
- constructor({ db, ctx, oldTable, dataImport }) {
- this.db = db
- this.ctx = ctx
- if (this.ctx && this.ctx.user) {
- this.appId = this.ctx.appId
- }
+ constructor({ user, oldTable, dataImport }) {
+ this.db = getAppDB()
+ this.user = user
this.oldTable = oldTable
this.dataImport = dataImport
    // any rows that need updating
@@ -233,25 +231,15 @@ class TableSaveFunctions {
// when confirmed valid
async mid(table) {
- let response = await exports.checkForColumnUpdates(
- this.appId,
- this.db,
- this.oldTable,
- table
- )
+ let response = await exports.checkForColumnUpdates(this.oldTable, table)
this.rows = this.rows.concat(response.rows)
return table
}
// after saving
async after(table) {
- table = await exports.handleSearchIndexes(this.appId, table)
- table = await exports.handleDataImport(
- this.appId,
- this.ctx.user,
- table,
- this.dataImport
- )
+ table = await exports.handleSearchIndexes(table)
+ table = await exports.handleDataImport(this.user, table, this.dataImport)
return table
}
@@ -260,8 +248,8 @@ class TableSaveFunctions {
}
}
-exports.getAllInternalTables = async appId => {
- const db = new CouchDB(appId)
+exports.getAllInternalTables = async () => {
+ const db = getAppDB()
const internalTables = await db.allDocs(
getTableParams(null, {
include_docs: true,
@@ -274,8 +262,8 @@ exports.getAllInternalTables = async appId => {
}))
}
-exports.getAllExternalTables = async (appId, datasourceId) => {
- const db = new CouchDB(appId)
+exports.getAllExternalTables = async datasourceId => {
+ const db = getAppDB()
const datasource = await db.get(datasourceId)
if (!datasource || !datasource.entities) {
throw "Datasource is not configured fully."
@@ -283,25 +271,25 @@ exports.getAllExternalTables = async (appId, datasourceId) => {
return datasource.entities
}
-exports.getExternalTable = async (appId, datasourceId, tableName) => {
- const entities = await exports.getAllExternalTables(appId, datasourceId)
+exports.getExternalTable = async (datasourceId, tableName) => {
+ const entities = await exports.getAllExternalTables(datasourceId)
return entities[tableName]
}
-exports.getTable = async (appId, tableId) => {
- const db = new CouchDB(appId)
+exports.getTable = async tableId => {
+ const db = getAppDB()
if (isExternalTable(tableId)) {
let { datasourceId, tableName } = breakExternalTableId(tableId)
const datasource = await db.get(datasourceId)
- const table = await exports.getExternalTable(appId, datasourceId, tableName)
+ const table = await exports.getExternalTable(datasourceId, tableName)
return { ...table, sql: isSQL(datasource) }
} else {
return db.get(tableId)
}
}
-exports.checkForViewUpdates = async (db, table, rename, deletedColumns) => {
- const views = await getViews(db)
+exports.checkForViewUpdates = async (table, rename, deletedColumns) => {
+ const views = await getViews()
const tableViews = views.filter(view => view.meta.tableId === table._id)
// Check each table view to see if impacted by this table action
@@ -363,7 +351,7 @@ exports.checkForViewUpdates = async (db, table, rename, deletedColumns) => {
// Update view if required
if (needsUpdated) {
const newViewTemplate = viewTemplate(view.meta)
- await saveView(db, null, view.name, newViewTemplate)
+ await saveView(null, view.name, newViewTemplate)
if (!newViewTemplate.meta.schema) {
newViewTemplate.meta.schema = table.schema
}
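`TableSaveFunctions` now carries only `user`, `oldTable`, and `dataImport`, resolving the DB itself via `getAppDB()`. Its before/mid/after phases each take the table and hand back a possibly updated copy, so the overall flow composes as a simple pipeline; a generic sketch of that shape (not the actual class, just the pattern it follows):

```javascript
// minimal sketch of the phased save flow used by TableSaveFunctions:
// each phase receives the current table and returns the (possibly
// updated) table, so the phases chain as a simple async pipeline
async function runSavePipeline(table, phases) {
  for (const phase of phases) {
    table = await phase(table)
  }
  return table
}
```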
diff --git a/packages/server/src/api/controllers/user.js b/packages/server/src/api/controllers/user.js
index 1bd8bd6a12..7d4ef65994 100644
--- a/packages/server/src/api/controllers/user.js
+++ b/packages/server/src/api/controllers/user.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../db")
const {
generateUserMetadataID,
getUserMetadataParams,
@@ -11,12 +10,14 @@ const { isEqual } = require("lodash")
const { BUILTIN_ROLE_IDS } = require("@budibase/backend-core/roles")
const {
getDevelopmentAppID,
- getDeployedAppIDs,
+ getProdAppIDs,
+ dbExists,
} = require("@budibase/backend-core/db")
-const { doesDatabaseExist } = require("../../utilities")
const { UserStatus } = require("@budibase/backend-core/constants")
+const { getAppDB, doInAppContext } = require("@budibase/backend-core/context")
-async function rawMetadata(db) {
+async function rawMetadata() {
+ const db = getAppDB()
return (
await db.allDocs(
getUserMetadataParams(null, {
@@ -54,13 +55,10 @@ function combineMetadataAndUser(user, metadata) {
return null
}
-exports.syncGlobalUsers = async appId => {
+exports.syncGlobalUsers = async () => {
// sync user metadata
- const db = new CouchDB(appId)
- const [users, metadata] = await Promise.all([
- getGlobalUsers(appId),
- rawMetadata(db),
- ])
+ const db = getAppDB()
+ const [users, metadata] = await Promise.all([getGlobalUsers(), rawMetadata()])
const toWrite = []
for (let user of users) {
const combined = await combineMetadataAndUser(user, metadata)
@@ -94,7 +92,7 @@ exports.syncUser = async function (ctx) {
let prodAppIds
// if they are a builder then get all production app IDs
if ((user.builder && user.builder.global) || deleting) {
- prodAppIds = await getDeployedAppIDs(CouchDB)
+ prodAppIds = await getProdAppIDs()
} else {
prodAppIds = Object.entries(roles)
.filter(entry => entry[1] !== BUILTIN_ROLE_IDS.PUBLIC)
@@ -104,37 +102,39 @@ exports.syncUser = async function (ctx) {
const roleId = roles[prodAppId]
const devAppId = getDevelopmentAppID(prodAppId)
for (let appId of [prodAppId, devAppId]) {
- if (!(await doesDatabaseExist(appId))) {
+ if (!(await dbExists(appId))) {
continue
}
- const db = new CouchDB(appId)
- const metadataId = generateUserMetadataID(userId)
- let metadata
- try {
- metadata = await db.get(metadataId)
- } catch (err) {
- if (deleting) {
- continue
- }
- metadata = {
- tableId: InternalTables.USER_METADATA,
- }
- }
- // assign the roleId for the metadata doc
- if (roleId) {
- metadata.roleId = roleId
- }
- let combined = !deleting
- ? combineMetadataAndUser(user, metadata)
- : {
- ...metadata,
- status: UserStatus.INACTIVE,
- metadata: BUILTIN_ROLE_IDS.PUBLIC,
+ await doInAppContext(appId, async () => {
+ const db = getAppDB()
+ const metadataId = generateUserMetadataID(userId)
+ let metadata
+ try {
+ metadata = await db.get(metadataId)
+ } catch (err) {
+ if (deleting) {
+ return
}
- // if its null then there was no updates required
- if (combined) {
- await db.put(combined)
- }
+ metadata = {
+ tableId: InternalTables.USER_METADATA,
+ }
+ }
+ // assign the roleId for the metadata doc
+ if (roleId) {
+ metadata.roleId = roleId
+ }
+ let combined = !deleting
+ ? combineMetadataAndUser(user, metadata)
+ : {
+ ...metadata,
+ status: UserStatus.INACTIVE,
+ metadata: BUILTIN_ROLE_IDS.PUBLIC,
+ }
+      // if it's null then no updates were required
+ if (combined) {
+ await db.put(combined)
+ }
+ })
}
}
ctx.body = {
@@ -143,8 +143,8 @@ exports.syncUser = async function (ctx) {
}
exports.fetchMetadata = async function (ctx) {
-  const database = new CouchDB(ctx.appId)
-  const global = await getGlobalUsers(ctx.appId)
-  const metadata = await rawMetadata(database)
+  const global = await getGlobalUsers()
+  const metadata = await rawMetadata()
const users = []
for (let user of global) {
@@ -173,8 +173,7 @@ exports.updateSelfMetadata = async function (ctx) {
}
exports.updateMetadata = async function (ctx) {
- const appId = ctx.appId
- const db = new CouchDB(appId)
+ const db = getAppDB()
const user = ctx.request.body
// this isn't applicable to the user
delete user.roles
@@ -186,7 +185,7 @@ exports.updateMetadata = async function (ctx) {
}
exports.destroyMetadata = async function (ctx) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
try {
const dbUser = await db.get(ctx.params.id)
await db.remove(dbUser._id, dbUser._rev)
@@ -209,7 +208,7 @@ exports.setFlag = async function (ctx) {
ctx.throw(400, "Must supply a 'flag' field in request body.")
}
const flagDocId = generateUserFlagID(userId)
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
let doc
try {
doc = await db.get(flagDocId)
@@ -224,7 +223,7 @@ exports.setFlag = async function (ctx) {
exports.getFlags = async function (ctx) {
const userId = ctx.user._id
const docId = generateUserFlagID(userId)
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
let doc
try {
doc = await db.get(docId)
diff --git a/packages/server/src/api/controllers/view/exporters.js b/packages/server/src/api/controllers/view/exporters.js
index 0cca3b5f89..1232640d0a 100644
--- a/packages/server/src/api/controllers/view/exporters.js
+++ b/packages/server/src/api/controllers/view/exporters.js
@@ -5,8 +5,11 @@ exports.csv = function (headers, rows) {
csv = `${csv}\n${headers
.map(header => {
let val = row[header]
- val = typeof val === "object" ? JSON.stringify(val) : val
- return `"${val}"`.trim()
+ val =
+ typeof val === "object"
+ ? `"${JSON.stringify(val).replace(/"/g, "'")}"`
+ : `"${val}"`
+ return val.trim()
})
.join(",")}`
}
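The exporter change wraps every cell in double quotes and, for object values, swaps embedded double quotes for single quotes so the stringified JSON cannot terminate the cell early. Pulled out as a standalone function (note this is a simpler scheme than RFC 4180, which escapes by doubling embedded quotes):

```javascript
// format one CSV cell the way the patched exporter does: objects are
// JSON-stringified with inner double quotes replaced by single quotes,
// and every value is wrapped in double quotes
function formatCell(val) {
  const formatted =
    typeof val === "object"
      ? `"${JSON.stringify(val).replace(/"/g, "'")}"`
      : `"${val}"`
  return formatted.trim()
}
```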
diff --git a/packages/server/src/api/controllers/view/index.js b/packages/server/src/api/controllers/view/index.js
index e3232323bf..fd6b32f3d6 100644
--- a/packages/server/src/api/controllers/view/index.js
+++ b/packages/server/src/api/controllers/view/index.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../../../db")
const viewTemplate = require("./viewBuilder")
const { apiFileReturn } = require("../../../utilities/fileSystem")
const exporters = require("./exporters")
@@ -6,14 +5,14 @@ const { saveView, getView, getViews, deleteView } = require("./utils")
const { fetchView } = require("../row")
const { getTable } = require("../table/utils")
const { FieldTypes } = require("../../../constants")
+const { getAppDB } = require("@budibase/backend-core/context")
exports.fetch = async ctx => {
- const db = new CouchDB(ctx.appId)
- ctx.body = await getViews(db)
+ ctx.body = await getViews()
}
exports.save = async ctx => {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const { originalName, ...viewToSave } = ctx.request.body
const view = viewTemplate(viewToSave)
@@ -21,7 +20,7 @@ exports.save = async ctx => {
ctx.throw(400, "Cannot create view without a name")
}
- await saveView(db, originalName, viewToSave.name, view)
+ await saveView(originalName, viewToSave.name, view)
// add views to table document
const table = await db.get(ctx.request.body.tableId)
@@ -42,9 +41,9 @@ exports.save = async ctx => {
}
exports.destroy = async ctx => {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const viewName = decodeURI(ctx.params.viewName)
- const view = await deleteView(db, viewName)
+ const view = await deleteView(viewName)
const table = await db.get(view.meta.tableId)
delete table.views[viewName]
await db.put(table)
@@ -53,9 +52,8 @@ exports.destroy = async ctx => {
}
exports.exportView = async ctx => {
- const db = new CouchDB(ctx.appId)
const viewName = decodeURI(ctx.query.view)
- const view = await getView(db, viewName)
+ const view = await getView(viewName)
const format = ctx.query.format
if (!format || !Object.values(exporters.ExportFormats).includes(format)) {
@@ -83,7 +81,7 @@ exports.exportView = async ctx => {
let schema = view && view.meta && view.meta.schema
if (!schema) {
const tableId = ctx.params.tableId || view.meta.tableId
- const table = await getTable(ctx.appId, tableId)
+ const table = await getTable(tableId)
schema = table.schema
}
diff --git a/packages/server/src/api/controllers/view/utils.js b/packages/server/src/api/controllers/view/utils.js
index 27fccaf47f..59d169ef7f 100644
--- a/packages/server/src/api/controllers/view/utils.js
+++ b/packages/server/src/api/controllers/view/utils.js
@@ -6,8 +6,10 @@ const {
SEPARATOR,
} = require("../../../db/utils")
const env = require("../../../environment")
+const { getAppDB } = require("@budibase/backend-core/context")
-exports.getView = async (db, viewName) => {
+exports.getView = async viewName => {
+ const db = getAppDB()
if (env.SELF_HOSTED) {
const designDoc = await db.get("_design/database")
return designDoc.views[viewName]
@@ -22,7 +24,8 @@ exports.getView = async (db, viewName) => {
}
}
-exports.getViews = async db => {
+exports.getViews = async () => {
+ const db = getAppDB()
const response = []
if (env.SELF_HOSTED) {
const designDoc = await db.get("_design/database")
@@ -54,7 +57,8 @@ exports.getViews = async db => {
return response
}
-exports.saveView = async (db, originalName, viewName, viewTemplate) => {
+exports.saveView = async (originalName, viewName, viewTemplate) => {
+ const db = getAppDB()
if (env.SELF_HOSTED) {
const designDoc = await db.get("_design/database")
designDoc.views = {
@@ -91,7 +95,8 @@ exports.saveView = async (db, originalName, viewName, viewTemplate) => {
}
}
-exports.deleteView = async (db, viewName) => {
+exports.deleteView = async viewName => {
+ const db = getAppDB()
if (env.SELF_HOSTED) {
const designDoc = await db.get("_design/database")
const view = designDoc.views[viewName]
diff --git a/packages/server/src/api/controllers/webhook.js b/packages/server/src/api/controllers/webhook.js
index 0230fb481b..49ab652cbf 100644
--- a/packages/server/src/api/controllers/webhook.js
+++ b/packages/server/src/api/controllers/webhook.js
@@ -1,9 +1,9 @@
-const CouchDB = require("../../db")
const { generateWebhookID, getWebhookParams } = require("../../db/utils")
const toJsonSchema = require("to-json-schema")
const validate = require("jsonschema").validate
const triggers = require("../../automations/triggers")
-const { getDeployedAppID } = require("@budibase/backend-core/db")
+const { getProdAppID } = require("@budibase/backend-core/db")
+const { getAppDB, updateAppId } = require("@budibase/backend-core/context")
const AUTOMATION_DESCRIPTION = "Generated from Webhook Schema"
@@ -23,7 +23,7 @@ exports.WebhookType = {
}
exports.fetch = async ctx => {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const response = await db.allDocs(
getWebhookParams(null, {
include_docs: true,
@@ -33,7 +33,7 @@ exports.fetch = async ctx => {
}
exports.save = async ctx => {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
const webhook = ctx.request.body
webhook.appId = ctx.appId
@@ -52,12 +52,13 @@ exports.save = async ctx => {
}
exports.destroy = async ctx => {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
ctx.body = await db.remove(ctx.params.id, ctx.params.rev)
}
exports.buildSchema = async ctx => {
- const db = new CouchDB(ctx.params.instance)
+ updateAppId(ctx.params.instance)
+ const db = getAppDB()
const webhook = await db.get(ctx.params.id)
webhook.bodySchema = toJsonSchema(ctx.request.body)
// update the automation outputs
@@ -81,9 +82,10 @@ exports.buildSchema = async ctx => {
}
exports.trigger = async ctx => {
- const prodAppId = getDeployedAppID(ctx.params.instance)
+ const prodAppId = getProdAppID(ctx.params.instance)
+ updateAppId(prodAppId)
try {
- const db = new CouchDB(prodAppId)
+ const db = getAppDB()
const webhook = await db.get(ctx.params.id)
// validate against the schema
if (webhook.bodySchema) {
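`exports.trigger` always resolves the production app before binding context: `getProdAppID` maps whichever variant of the app ID arrives to its production counterpart. A hypothetical reimplementation for illustration only (it assumes the convention that development app IDs carry a `_dev` marker, e.g. `app_dev_…` vs `app_…`; the real helper lives in `@budibase/backend-core/db`):

```javascript
// hypothetical sketch: strip the dev marker from an app ID
// to obtain its production equivalent
const DEV_PREFIX = "app_dev_"
const PROD_PREFIX = "app_"

function getProdAppID(appId) {
  if (appId.startsWith(DEV_PREFIX)) {
    return PROD_PREFIX + appId.slice(DEV_PREFIX.length)
  }
  return appId // already a production ID
}
```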
diff --git a/packages/server/src/api/routes/tests/automation.spec.js b/packages/server/src/api/routes/tests/automation.spec.js
index c412c34fdc..3e5725bb95 100644
--- a/packages/server/src/api/routes/tests/automation.spec.js
+++ b/packages/server/src/api/routes/tests/automation.spec.js
@@ -145,6 +145,7 @@ describe("/automations", () => {
let table = await config.createTable()
automation.definition.trigger.inputs.tableId = table._id
automation.definition.steps[0].inputs.row.tableId = table._id
+ automation.appId = config.appId
automation = await config.createAutomation(automation)
await setup.delay(500)
const res = await testAutomation(config, automation)
diff --git a/packages/server/src/api/routes/tests/misc.spec.js b/packages/server/src/api/routes/tests/misc.spec.js
index ae5c0cca60..e5b87543d2 100644
--- a/packages/server/src/api/routes/tests/misc.spec.js
+++ b/packages/server/src/api/routes/tests/misc.spec.js
@@ -82,7 +82,6 @@ describe("run misc tests", () => {
dataImport.schema[col] = { type: "string" }
}
await tableUtils.handleDataImport(
- config.getAppId(),
{ userId: "test" },
table,
dataImport
diff --git a/packages/server/src/api/routes/tests/query.spec.js b/packages/server/src/api/routes/tests/query.spec.js
index 9357d53cde..dac576836e 100644
--- a/packages/server/src/api/routes/tests/query.spec.js
+++ b/packages/server/src/api/routes/tests/query.spec.js
@@ -230,7 +230,6 @@ describe("/queries", () => {
})
describe("variables", () => {
-
async function preview(datasource, fields) {
return config.previewQuery(request, config, datasource, fields)
}
diff --git a/packages/server/src/api/routes/tests/routing.spec.js b/packages/server/src/api/routes/tests/routing.spec.js
index fdc414448c..d6d05c3322 100644
--- a/packages/server/src/api/routes/tests/routing.spec.js
+++ b/packages/server/src/api/routes/tests/routing.spec.js
@@ -1,10 +1,15 @@
const setup = require("./utilities")
const { basicScreen } = setup.structures
-const { checkBuilderEndpoint } = require("./utilities/TestFunctions")
+const { checkBuilderEndpoint, runInProd } = require("./utilities/TestFunctions")
const { BUILTIN_ROLE_IDS } = require("@budibase/backend-core/roles")
+const { doInAppContext } = require("@budibase/backend-core/context")
const route = "/test"
+// there are checks which are disabled in the test env; they need to be
+// enabled (via runInProd) for the tests below
+
describe("/routing", () => {
let request = setup.getRequest()
let config = setup.getConfig()
@@ -26,20 +31,24 @@ describe("/routing", () => {
describe("fetch", () => {
it("prevents a public user from accessing development app", async () => {
- await request
- .get(`/api/routing/client`)
- .set(config.publicHeaders({ prodApp: false }))
- .expect(302)
+ await runInProd(() => {
+ return request
+ .get(`/api/routing/client`)
+ .set(config.publicHeaders({ prodApp: false }))
+ .expect(302)
+ })
})
it("prevents a non builder from accessing development app", async () => {
- await request
- .get(`/api/routing/client`)
- .set(await config.roleHeaders({
- roleId: BUILTIN_ROLE_IDS.BASIC,
- prodApp: false
- }))
- .expect(302)
+ await runInProd(async () => {
+ return request
+ .get(`/api/routing/client`)
+ .set(await config.roleHeaders({
+ roleId: BUILTIN_ROLE_IDS.BASIC,
+ prodApp: false
+ }))
+ .expect(302)
+ })
})
it("returns the correct routing for basic user", async () => {
const res = await request
diff --git a/packages/server/src/api/routes/tests/row.spec.js b/packages/server/src/api/routes/tests/row.spec.js
index 01284552c5..8354f01ad7 100644
--- a/packages/server/src/api/routes/tests/row.spec.js
+++ b/packages/server/src/api/routes/tests/row.spec.js
@@ -1,6 +1,7 @@
const { outputProcessing } = require("../../../utilities/rowProcessor")
const setup = require("./utilities")
const { basicRow } = setup.structures
+const { doInAppContext } = require("@budibase/backend-core/context")
// mock the fetch for the search system
jest.mock("node-fetch")
@@ -387,10 +388,12 @@ describe("/rows", () => {
})
// the environment needs configured for this
await setup.switchToSelfHosted(async () => {
- const enriched = await outputProcessing({ appId: config.getAppId() }, table, [row])
- expect(enriched[0].attachment[0].url).toBe(
- `/prod-budi-app-assets/${config.getAppId()}/attachments/test/thing.csv`
- )
+      await doInAppContext(config.getAppId(), async () => {
+ const enriched = await outputProcessing(table, [row])
+ expect(enriched[0].attachment[0].url).toBe(
+ `/prod-budi-app-assets/${config.getAppId()}/attachments/test/thing.csv`
+ )
+ })
})
})
})
diff --git a/packages/server/src/api/routes/tests/utilities/TestFunctions.js b/packages/server/src/api/routes/tests/utilities/TestFunctions.js
index 9bd54f0d75..c752507d25 100644
--- a/packages/server/src/api/routes/tests/utilities/TestFunctions.js
+++ b/packages/server/src/api/routes/tests/utilities/TestFunctions.js
@@ -1,9 +1,10 @@
const rowController = require("../../../controllers/row")
const appController = require("../../../controllers/application")
-const CouchDB = require("../../../../db")
const { AppStatus } = require("../../../../db/utils")
const { BUILTIN_ROLE_IDS } = require("@budibase/backend-core/roles")
const { TENANT_ID } = require("../../../../tests/utilities/structures")
+const { getAppDB, doInAppContext } = require("@budibase/backend-core/context")
+const env = require("../../../../environment")
function Request(appId, params) {
this.appId = appId
@@ -11,9 +12,15 @@ function Request(appId, params) {
this.request = {}
}
+function runRequest(appId, controlFunc, request) {
+ return doInAppContext(appId, async () => {
+ return controlFunc(request)
+ })
+}
+
exports.getAllTableRows = async config => {
const req = new Request(config.appId, { tableId: config.table._id })
- await rowController.fetch(req)
+ await runRequest(config.appId, rowController.fetch, req)
return req.body
}
@@ -26,14 +33,17 @@ exports.clearAllApps = async (tenantId = TENANT_ID) => {
}
for (let app of apps) {
const { appId } = app
- await appController.delete(new Request(null, { appId }))
+ const req = new Request(null, { appId })
+ await runRequest(appId, appController.delete, req)
}
}
exports.clearAllAutomations = async config => {
const automations = await config.getAllAutomations()
for (let auto of automations) {
- await config.deleteAutomation(auto)
+ await doInAppContext(config.appId, async () => {
+ await config.deleteAutomation(auto)
+ })
}
}
@@ -96,20 +106,32 @@ exports.checkPermissionsEndpoint = async ({
.expect(403)
}
-exports.getDB = config => {
- return new CouchDB(config.getAppId())
+exports.getDB = () => {
+ return getAppDB()
}
exports.testAutomation = async (config, automation) => {
- return await config.request
- .post(`/api/automations/${automation._id}/test`)
- .send({
- row: {
- name: "Test",
- description: "TEST",
- },
- })
- .set(config.defaultHeaders())
- .expect("Content-Type", /json/)
- .expect(200)
+ return runRequest(automation.appId, async () => {
+ return await config.request
+ .post(`/api/automations/${automation._id}/test`)
+ .send({
+ row: {
+ name: "Test",
+ description: "TEST",
+ },
+ })
+ .set(config.defaultHeaders())
+ .expect("Content-Type", /json/)
+ .expect(200)
+ })
+}
+
+exports.runInProd = async func => {
+  const nodeEnv = env.NODE_ENV
+  const workerId = env.JEST_WORKER_ID
+  env._set("NODE_ENV", "PRODUCTION")
+  env._set("JEST_WORKER_ID", null)
+  try {
+    await func()
+  } finally {
+    // restore the original env even if the wrapped function throws
+    env._set("NODE_ENV", nodeEnv)
+    env._set("JEST_WORKER_ID", workerId)
+  }
}
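`runInProd` flips process-wide env flags around the wrapped function. The same save/override/restore idea can be written generically; this sketch assumes an env-like object exposing `_set` (as the test helper's `env` does) and restores in a `finally` so a throwing callback cannot leak the overrides:

```javascript
// temporarily override keys on an env-like object while fn runs,
// restoring the saved values even if fn throws
async function withEnv(env, overrides, fn) {
  const saved = {}
  for (const key of Object.keys(overrides)) {
    saved[key] = env[key]
    env._set(key, overrides[key])
  }
  try {
    return await fn()
  } finally {
    for (const key of Object.keys(saved)) {
      env._set(key, saved[key])
    }
  }
}
```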
diff --git a/packages/server/src/automations/automationUtils.js b/packages/server/src/automations/automationUtils.js
index aab341a1f8..9360840efd 100644
--- a/packages/server/src/automations/automationUtils.js
+++ b/packages/server/src/automations/automationUtils.js
@@ -53,13 +53,12 @@ exports.cleanInputValues = (inputs, schema) => {
* the automation but is instead part of the Table/Table. This function will get the table schema and use it to instead
* perform the cleanInputValues function on the input row.
*
- * @param {string} appId The instance which the Table/Table is contained under.
* @param {string} tableId The ID of the Table/Table which the schema is to be retrieved for.
* @param {object} row The input row structure which requires clean-up after having been through template statements.
 * @returns {Promise} The cleaned up rows object, which should now have all the required primitive types.
*/
-exports.cleanUpRow = async (appId, tableId, row) => {
- let table = await getTable(appId, tableId)
+exports.cleanUpRow = async (tableId, row) => {
+ let table = await getTable(tableId)
return exports.cleanInputValues(row, { properties: table.schema })
}
diff --git a/packages/server/src/automations/steps/createRow.js b/packages/server/src/automations/steps/createRow.js
index 1937121062..a16521d25d 100644
--- a/packages/server/src/automations/steps/createRow.js
+++ b/packages/server/src/automations/steps/createRow.js
@@ -78,7 +78,6 @@ exports.run = async function ({ inputs, appId, emitter }) {
try {
inputs.row = await automationUtils.cleanUpRow(
- appId,
inputs.row.tableId,
inputs.row
)
diff --git a/packages/server/src/automations/steps/updateRow.js b/packages/server/src/automations/steps/updateRow.js
index a9569932fa..f66fcf9432 100644
--- a/packages/server/src/automations/steps/updateRow.js
+++ b/packages/server/src/automations/steps/updateRow.js
@@ -87,7 +87,7 @@ exports.run = async function ({ inputs, appId, emitter }) {
try {
if (tableId) {
- inputs.row = await automationUtils.cleanUpRow(appId, tableId, inputs.row)
+ inputs.row = await automationUtils.cleanUpRow(tableId, inputs.row)
}
await rowController.patch(ctx)
return {
diff --git a/packages/server/src/automations/triggers.js b/packages/server/src/automations/triggers.js
index 49e50ec34f..deff9f7503 100644
--- a/packages/server/src/automations/triggers.js
+++ b/packages/server/src/automations/triggers.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../db")
const emitter = require("../events/index")
const { getAutomationParams } = require("../db/utils")
const { coerce } = require("../utilities/rowProcessor")
@@ -9,6 +8,7 @@ const { queue } = require("./bullboard")
const { checkTestFlag } = require("../utilities/redis")
const utils = require("./utils")
const env = require("../environment")
+const { doInAppContext, getAppDB } = require("@budibase/backend-core/context")
const TRIGGER_DEFINITIONS = definitions
const JOB_OPTS = {
@@ -21,39 +21,41 @@ async function queueRelevantRowAutomations(event, eventType) {
throw `No appId specified for ${eventType} - check event emitters.`
}
- const db = new CouchDB(event.appId)
- let automations = await db.allDocs(
- getAutomationParams(null, { include_docs: true })
- )
+  return doInAppContext(event.appId, async () => {
+ const db = getAppDB()
+ let automations = await db.allDocs(
+ getAutomationParams(null, { include_docs: true })
+ )
- // filter down to the correct event type
- automations = automations.rows
- .map(automation => automation.doc)
- .filter(automation => {
- const trigger = automation.definition.trigger
- return trigger && trigger.event === eventType
- })
+ // filter down to the correct event type
+ automations = automations.rows
+ .map(automation => automation.doc)
+ .filter(automation => {
+ const trigger = automation.definition.trigger
+ return trigger && trigger.event === eventType
+ })
- for (let automation of automations) {
- let automationDef = automation.definition
- let automationTrigger = automationDef ? automationDef.trigger : {}
- // don't queue events which are for dev apps, only way to test automations is
- // running tests on them, in production the test flag will never
- // be checked due to lazy evaluation (first always false)
- if (
- !env.ALLOW_DEV_AUTOMATIONS &&
- isDevAppID(event.appId) &&
- !(await checkTestFlag(automation._id))
- ) {
- continue
+ for (let automation of automations) {
+ let automationDef = automation.definition
+ let automationTrigger = automationDef ? automationDef.trigger : {}
+      // don't queue events for dev apps; running a test is the only way to
+      // trigger a dev automation, and in production the test flag is never
+      // checked thanks to lazy evaluation (the first condition is always false)
+ if (
+ !env.ALLOW_DEV_AUTOMATIONS &&
+ isDevAppID(event.appId) &&
+ !(await checkTestFlag(automation._id))
+ ) {
+ continue
+ }
+ if (
+ automationTrigger.inputs &&
+ automationTrigger.inputs.tableId === event.row.tableId
+ ) {
+ await queue.add({ automation, event }, JOB_OPTS)
+ }
}
- if (
- automationTrigger.inputs &&
- automationTrigger.inputs.tableId === event.row.tableId
- ) {
- await queue.add({ automation, event }, JOB_OPTS)
- }
- }
+ })
}
emitter.on("row:save", async function (event) {
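Inside the new `doInAppContext` wrapper the queueing logic itself is unchanged: fetch all automation docs, keep those whose trigger matches the event type, and queue the ones targeting the event's table. That selection can be sketched on its own (combining the event-type and tableId checks from the patch into one filter):

```javascript
// select the automations that should be queued for a row event:
// the trigger must exist, match the event type, and reference
// the same table as the event
function relevantAutomations(automations, eventType, tableId) {
  return automations.filter(automation => {
    const trigger = automation.definition && automation.definition.trigger
    return (
      !!trigger &&
      trigger.event === eventType &&
      !!trigger.inputs &&
      trigger.inputs.tableId === tableId
    )
  })
}
```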
diff --git a/packages/server/src/automations/utils.js b/packages/server/src/automations/utils.js
index 4a554793f8..3ee1f535c7 100644
--- a/packages/server/src/automations/utils.js
+++ b/packages/server/src/automations/utils.js
@@ -6,8 +6,9 @@ const { queue } = require("./bullboard")
const newid = require("../db/newid")
const { updateEntityMetadata } = require("../utilities")
const { MetadataTypes } = require("../constants")
-const { getDeployedAppID } = require("@budibase/backend-core/db")
+const { getProdAppID } = require("@budibase/backend-core/db")
const { cloneDeep } = require("lodash/fp")
+const { getAppDB, getAppId } = require("@budibase/backend-core/context")
const WH_STEP_ID = definitions.WEBHOOK.stepId
const CRON_STEP_ID = definitions.CRON.stepId
@@ -27,7 +28,6 @@ exports.processEvent = async job => {
exports.updateTestHistory = async (appId, automation, history) => {
return updateEntityMetadata(
- appId,
MetadataTypes.AUTOMATION_TEST_HISTORY,
automation._id,
metadata => {
@@ -93,6 +93,9 @@ exports.enableCronTrigger = async (appId, automation) => {
)
// Assign cron job ID from bull so we can remove it later if the cron trigger is removed
trigger.cronJobId = job.id
+ // can't use getAppDB here - this is usually called from a dev app, but the
+ // target could be either the dev or prod app, so use the appId that was
+ // passed in
const db = new CouchDB(appId)
const response = await db.put(automation)
automation._id = response.id
@@ -109,7 +112,8 @@ exports.enableCronTrigger = async (appId, automation) => {
* @returns {Promise} After this is complete the new automation object may have been updated and should be
* written to DB (this does not write to DB as it would be wasteful to repeat).
*/
-exports.checkForWebhooks = async ({ appId, oldAuto, newAuto }) => {
+exports.checkForWebhooks = async ({ oldAuto, newAuto }) => {
+ const appId = getAppId()
const oldTrigger = oldAuto ? oldAuto.definition.trigger : null
const newTrigger = newAuto ? newAuto.definition.trigger : null
const triggerChanged =
@@ -128,7 +132,7 @@ exports.checkForWebhooks = async ({ appId, oldAuto, newAuto }) => {
oldTrigger.webhookId
) {
try {
- let db = new CouchDB(appId)
+ let db = getAppDB()
// need to get the webhook to get the rev
const webhook = await db.get(oldTrigger.webhookId)
const ctx = {
@@ -166,7 +170,7 @@ exports.checkForWebhooks = async ({ appId, oldAuto, newAuto }) => {
// the app ID has to be development for this endpoint
// it can only be used when building the app
// but the trigger endpoint will always be used in production
- const prodAppId = getDeployedAppID(appId)
+ const prodAppId = getProdAppID(appId)
newTrigger.inputs = {
schemaUrl: `api/webhooks/schema/${appId}/${id}`,
triggerUrl: `api/webhooks/trigger/${prodAppId}/${id}`,
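The rename from `getDeployedAppID` to `getProdAppID` keeps the same behaviour: mapping a development app ID onto its production counterpart. A sketch of that convention, assuming an `app_dev_` prefix (illustrative only, the real backend-core helpers may differ):

```javascript
// Hypothetical sketch of the app ID convention behind isDevAppID/getProdAppID.
const APP_PREFIX = "app_"
const APP_DEV_PREFIX = "app_dev_"

function isDevAppID(appId) {
  return appId.startsWith(APP_DEV_PREFIX)
}

function getProdAppID(appId) {
  // prod IDs pass through unchanged
  if (!isDevAppID(appId)) {
    return appId
  }
  return APP_PREFIX + appId.slice(APP_DEV_PREFIX.length)
}
```

This is why the webhook trigger URL above can always point at the prod app even while building in a dev app.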
diff --git a/packages/server/src/db/linkedRows/LinkController.js b/packages/server/src/db/linkedRows/LinkController.js
index b66e2debb5..86c32bf94f 100644
--- a/packages/server/src/db/linkedRows/LinkController.js
+++ b/packages/server/src/db/linkedRows/LinkController.js
@@ -1,4 +1,3 @@
-const CouchDB = require("../index")
const { IncludeDocs, getLinkDocuments } = require("./linkUtils")
const {
generateLinkID,
@@ -7,6 +6,7 @@ const {
} = require("../utils")
const Sentry = require("@sentry/node")
const { FieldTypes, RelationshipTypes } = require("../../constants")
+const { getAppDB } = require("@budibase/backend-core/context")
/**
* Creates a new link document structure which can be put to the database. It is important to
@@ -52,9 +52,8 @@ function LinkDocument(
}
class LinkController {
- constructor({ appId, tableId, row, table, oldTable }) {
- this._appId = appId
- this._db = new CouchDB(appId)
+ constructor({ tableId, row, table, oldTable }) {
+ this._db = getAppDB()
this._tableId = tableId
this._row = row
this._table = table
@@ -99,7 +98,6 @@ class LinkController {
*/
getRowLinkDocs(rowId) {
return getLinkDocuments({
- appId: this._appId,
tableId: this._tableId,
rowId,
includeDocs: IncludeDocs.INCLUDE,
@@ -111,7 +109,6 @@ class LinkController {
*/
getTableLinkDocs() {
return getLinkDocuments({
- appId: this._appId,
tableId: this._tableId,
includeDocs: IncludeDocs.INCLUDE,
})
@@ -230,7 +227,6 @@ class LinkController {
if (linkedSchema.relationshipType === RelationshipTypes.ONE_TO_MANY) {
let links = (
await getLinkDocuments({
- appId: this._appId,
tableId: field.tableId,
rowId: linkId,
includeDocs: IncludeDocs.EXCLUDE,
diff --git a/packages/server/src/db/linkedRows/index.js b/packages/server/src/db/linkedRows/index.js
index eab287aa33..6cb45f9781 100644
--- a/packages/server/src/db/linkedRows/index.js
+++ b/packages/server/src/db/linkedRows/index.js
@@ -9,12 +9,12 @@ const {
getLinkedTable,
} = require("./linkUtils")
const { flatten } = require("lodash")
-const CouchDB = require("../../db")
const { FieldTypes } = require("../../constants")
const { getMultiIDParams, USER_METDATA_PREFIX } = require("../../db/utils")
const { partition } = require("lodash")
const { getGlobalUsersFromMetadata } = require("../../utilities/global")
const { processFormulas } = require("../../utilities/rowProcessor/utils")
+const { getAppDB } = require("@budibase/backend-core/context")
/**
* This functionality makes sure that when rows with links are created, updated or deleted they are processed
@@ -48,14 +48,13 @@ function clearRelationshipFields(table, rows) {
return rows
}
-async function getLinksForRows(appId, rows) {
+async function getLinksForRows(rows) {
const tableIds = [...new Set(rows.map(el => el.tableId))]
// start by getting all the link values for performance reasons
const responses = flatten(
await Promise.all(
tableIds.map(tableId =>
getLinkDocuments({
- appId,
tableId: tableId,
includeDocs: IncludeDocs.EXCLUDE,
})
@@ -72,9 +71,9 @@ async function getLinksForRows(appId, rows) {
)
}
-async function getFullLinkedDocs(appId, links) {
+async function getFullLinkedDocs(links) {
// create DBs
- const db = new CouchDB(appId)
+ const db = getAppDB()
const linkedRowIds = links.map(link => link.id)
const uniqueRowIds = [...new Set(linkedRowIds)]
let dbRows = (await db.allDocs(getMultiIDParams(uniqueRowIds))).rows.map(
@@ -88,7 +87,7 @@ async function getFullLinkedDocs(appId, links) {
let [users, other] = partition(linked, linkRow =>
linkRow._id.startsWith(USER_METDATA_PREFIX)
)
- users = await getGlobalUsersFromMetadata(appId, users)
+ users = await getGlobalUsersFromMetadata(users)
return [...other, ...users]
}
@@ -96,20 +95,16 @@ async function getFullLinkedDocs(appId, links) {
* Update link documents for a row or table - this is to be called by the API controller when a change is occurring.
* @param {string} args.eventType states what type of change which is occurring, means this can be expanded upon in the
* future quite easily (all updates go through one function).
- * @param {string} args.appId The ID of the instance in which the change is occurring.
 * @param {string} args.tableId The ID of the table which is being changed.
- * @param {object|null} args.row The row which is changing, e.g. created, updated or deleted.
- * @param {object|null} args.table If the table has already been retrieved this can be used to reduce database gets.
- * @param {object|null} args.oldTable If the table is being updated then the old table can be provided for differencing.
+ * @param {object|undefined} args.row The row which is changing, e.g. created, updated or deleted.
+ * @param {object|undefined} args.table If the table has already been retrieved this can be used to reduce database gets.
+ * @param {object|undefined} args.oldTable If the table is being updated then the old table can be provided for differencing.
* @returns {Promise} When the update is complete this will respond successfully. Returns the row for
* row operations and the table for table operations.
*/
exports.updateLinks = async function (args) {
- const { eventType, appId, row, tableId, table, oldTable } = args
+ const { eventType, row, tableId, table, oldTable } = args
const baseReturnObj = row == null ? table : row
- if (appId == null) {
- throw "Cannot operate without an instance ID."
- }
// make sure table ID is set
if (tableId == null && table != null) {
args.tableId = table._id
@@ -146,26 +141,23 @@ exports.updateLinks = async function (args) {
/**
* Given a table and a list of rows this will retrieve all of the attached docs and enrich them into the row.
* This is required for formula fields, this may only be utilised internally (for now).
- * @param {string} appId The ID of the app which this request is in the context of.
* @param {object} table The table from which the rows originated.
* @param {array} rows The rows which are to be enriched.
* @return {Promise<*>} returns the rows with all of the enriched relationships on it.
*/
-exports.attachFullLinkedDocs = async (appId, table, rows) => {
+exports.attachFullLinkedDocs = async (table, rows) => {
const linkedTableIds = getLinkedTableIDs(table)
if (linkedTableIds.length === 0) {
return rows
}
- // create DBs
- const db = new CouchDB(appId)
// get all the links
- const links = (await getLinksForRows(appId, rows)).filter(link =>
+ const links = (await getLinksForRows(rows)).filter(link =>
rows.some(row => row._id === link.thisId)
)
// clear any existing links that could be dupe'd
rows = clearRelationshipFields(table, rows)
// now get the docs and combine into the rows
- let linked = await getFullLinkedDocs(appId, links)
+ let linked = await getFullLinkedDocs(links)
const linkedTables = []
for (let row of rows) {
for (let link of links.filter(link => link.thisId === row._id)) {
@@ -176,11 +168,7 @@ exports.attachFullLinkedDocs = async (appId, table, rows) => {
if (linkedRow) {
const linkedTableId =
linkedRow.tableId || getRelatedTableForField(table, link.fieldName)
- const linkedTable = await getLinkedTable(
- db,
- linkedTableId,
- linkedTables
- )
+ const linkedTable = await getLinkedTable(linkedTableId, linkedTables)
if (linkedTable) {
row[link.fieldName].push(processFormulas(linkedTable, linkedRow))
}
@@ -192,18 +180,16 @@ exports.attachFullLinkedDocs = async (appId, table, rows) => {
/**
* This function will take the given enriched rows and squash the links to only contain the primary display field.
- * @param {string} appId The app in which the tables/rows/links exist.
* @param {object} table The table from which the rows originated.
* @param {array} enriched The pre-enriched rows (full docs) which are to be squashed.
* @returns {Promise} The rows after having their links squashed to only contain the ID and primary display.
*/
-exports.squashLinksToPrimaryDisplay = async (appId, table, enriched) => {
- const db = new CouchDB(appId)
+exports.squashLinksToPrimaryDisplay = async (table, enriched) => {
// will populate this as we find them
const linkedTables = [table]
for (let row of enriched) {
// this only fetches the table if its not already in array
- const rowTable = await getLinkedTable(db, row.tableId, linkedTables)
+ const rowTable = await getLinkedTable(row.tableId, linkedTables)
for (let [column, schema] of Object.entries(rowTable.schema)) {
if (schema.type !== FieldTypes.LINK || !Array.isArray(row[column])) {
continue
@@ -211,7 +197,7 @@ exports.squashLinksToPrimaryDisplay = async (appId, table, enriched) => {
const newLinks = []
for (let link of row[column]) {
const linkTblId = link.tableId || getRelatedTableForField(table, column)
- const linkedTable = await getLinkedTable(db, linkTblId, linkedTables)
+ const linkedTable = await getLinkedTable(linkTblId, linkedTables)
const obj = { _id: link._id }
if (link[linkedTable.primaryDisplay]) {
obj.primaryDisplay = link[linkedTable.primaryDisplay]
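The squashing loop above reduces each enriched linked row to just its ID and the linked table's primary display column. A standalone sketch of that reduction (names are illustrative):

```javascript
// Sketch of link squashing: keep only _id plus the primary display value.
function squashLinks(linkedRows, primaryDisplay) {
  return linkedRows.map(link => {
    const obj = { _id: link._id }
    if (link[primaryDisplay]) {
      obj.primaryDisplay = link[primaryDisplay]
    }
    return obj
  })
}

const squashed = squashLinks(
  [{ _id: "ro_1", name: "Alice", age: 30 }],
  "name"
)
```

All other columns of the linked row are dropped, which keeps enriched responses small.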
diff --git a/packages/server/src/db/linkedRows/linkUtils.js b/packages/server/src/db/linkedRows/linkUtils.js
index 12e72af78d..5af4aa919a 100644
--- a/packages/server/src/db/linkedRows/linkUtils.js
+++ b/packages/server/src/db/linkedRows/linkUtils.js
@@ -1,8 +1,8 @@
-const CouchDB = require("../index")
const Sentry = require("@sentry/node")
const { ViewNames, getQueryIndex } = require("../utils")
const { FieldTypes } = require("../../constants")
const { createLinkView } = require("../views/staticViews")
+const { getAppDB } = require("@budibase/backend-core/context")
/**
* Only needed so that boolean parameters are being used for includeDocs
@@ -17,7 +17,6 @@ exports.createLinkView = createLinkView
/**
* Gets the linking documents, not the linked documents themselves.
- * @param {string} args.appId The instance in which we are searching for linked rows.
* @param {string} args.tableId The table which we are searching for linked rows against.
* @param {string|null} args.fieldName The name of column/field which is being altered, only looking for
* linking documents that are related to it. If this is not specified then the table level will be assumed.
@@ -30,8 +29,8 @@ exports.createLinkView = createLinkView
* (if any).
*/
exports.getLinkDocuments = async function (args) {
- const { appId, tableId, rowId, includeDocs } = args
- const db = new CouchDB(appId)
+ const { tableId, rowId, includeDocs } = args
+ const db = getAppDB()
let params
if (rowId != null) {
params = { key: [tableId, rowId] }
@@ -68,7 +67,7 @@ exports.getLinkDocuments = async function (args) {
} catch (err) {
// check if the view doesn't exist, it should for all new instances
if (err != null && err.name === "not_found") {
- await exports.createLinkView(appId)
+ await exports.createLinkView()
return exports.getLinkDocuments(arguments[0])
} else {
/* istanbul ignore next */
@@ -89,7 +88,8 @@ exports.getLinkedTableIDs = table => {
.map(column => column.tableId)
}
-exports.getLinkedTable = async (db, id, tables) => {
+exports.getLinkedTable = async (id, tables) => {
+ const db = getAppDB()
let linkedTable = tables.find(table => table._id === id)
if (linkedTable) {
return linkedTable
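`getLinkedTable` follows a find-or-fetch pattern: check the caller-supplied list first, only hit the DB on a miss, and push the result into the list so later lookups in the same request are free. A sketch with the DB call mocked out:

```javascript
// Sketch of the getLinkedTable caching pattern (DB access mocked).
function getLinkedTableCached(fetchFromDb, id, tables) {
  let linkedTable = tables.find(table => table._id === id)
  if (linkedTable) {
    return linkedTable
  }
  linkedTable = fetchFromDb(id)
  if (linkedTable) {
    // cache for subsequent lookups in this request
    tables.push(linkedTable)
  }
  return linkedTable
}

let dbHits = 0
const fetchFromDb = id => {
  dbHits++
  return { _id: id, name: "tbl" }
}

const tables = []
getLinkedTableCached(fetchFromDb, "ta_1", tables)
getLinkedTableCached(fetchFromDb, "ta_1", tables) // served from the list
```

This is why the enrichment loops pass a shared `linkedTables` array through every call.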
diff --git a/packages/server/src/db/tests/linkController.spec.js b/packages/server/src/db/tests/linkController.spec.js
index d45bd99ea2..180cc2b3a0 100644
--- a/packages/server/src/db/tests/linkController.spec.js
+++ b/packages/server/src/db/tests/linkController.spec.js
@@ -20,7 +20,6 @@ describe("test the link controller", () => {
function createLinkController(table, row = null, oldTable = null) {
const linkConfig = {
- appId: config.getAppId(),
tableId: table._id,
table,
}
diff --git a/packages/server/src/db/tests/linkTests.spec.js b/packages/server/src/db/tests/linkTests.spec.js
index 8dad7be049..9a309df70a 100644
--- a/packages/server/src/db/tests/linkTests.spec.js
+++ b/packages/server/src/db/tests/linkTests.spec.js
@@ -1,8 +1,8 @@
const TestConfig = require("../../tests/utilities/TestConfiguration")
-const { basicTable, basicLinkedRow } = require("../../tests/utilities/structures")
+const { basicTable } = require("../../tests/utilities/structures")
const linkUtils = require("../linkedRows/linkUtils")
-const links = require("../linkedRows")
const CouchDB = require("../index")
+const { getAppDB } = require("@budibase/backend-core/context")
describe("test link functionality", () => {
const config = new TestConfig(false)
@@ -11,18 +11,18 @@ describe("test link functionality", () => {
let db, table
beforeEach(async () => {
await config.init()
- db = new CouchDB(config.getAppId())
+ db = getAppDB()
table = await config.createTable()
})
it("should be able to retrieve a linked table from a list", async () => {
- const retrieved = await linkUtils.getLinkedTable(db, table._id, [table])
+ const retrieved = await linkUtils.getLinkedTable(table._id, [table])
expect(retrieved._id).toBe(table._id)
})
it("should be able to retrieve a table from DB and update list", async () => {
const tables = []
- const retrieved = await linkUtils.getLinkedTable(db, table._id, tables)
+ const retrieved = await linkUtils.getLinkedTable(table._id, tables)
expect(retrieved._id).toBe(table._id)
expect(tables[0]).toBeDefined()
})
@@ -51,7 +51,6 @@ describe("test link functionality", () => {
const db = new CouchDB("test")
await db.put({ _id: "_design/database", views: {} })
const output = await linkUtils.getLinkDocuments({
- appId: "test",
tableId: "test",
rowId: "test",
includeDocs: false,
diff --git a/packages/server/src/db/views/staticViews.js b/packages/server/src/db/views/staticViews.js
index 8e7b101ef5..50b7c305d3 100644
--- a/packages/server/src/db/views/staticViews.js
+++ b/packages/server/src/db/views/staticViews.js
@@ -1,4 +1,4 @@
-const CouchDB = require("../index")
+const { getAppDB } = require("@budibase/backend-core/context")
const {
DocumentTypes,
SEPARATOR,
@@ -21,12 +21,11 @@ const SCREEN_PREFIX = DocumentTypes.SCREEN + SEPARATOR
/**
* Creates the link view for the instance, this will overwrite the existing one, but this should only
* be called if it is found that the view does not exist.
- * @param {string} appId The instance to which the view should be added.
* @returns {Promise} The view now exists, please note that the next view of this query will actually build it,
* so it may be slow.
*/
-exports.createLinkView = async appId => {
- const db = new CouchDB(appId)
+exports.createLinkView = async () => {
+ const db = getAppDB()
const designDoc = await db.get("_design/database")
const view = {
map: function (doc) {
@@ -57,8 +56,8 @@ exports.createLinkView = async appId => {
await db.put(designDoc)
}
-exports.createRoutingView = async appId => {
- const db = new CouchDB(appId)
+exports.createRoutingView = async () => {
+ const db = getAppDB()
const designDoc = await db.get("_design/database")
const view = {
// if using variables in a map function need to inject them before use
@@ -78,8 +77,8 @@ exports.createRoutingView = async appId => {
await db.put(designDoc)
}
-async function searchIndex(appId, indexName, fnString) {
- const db = new CouchDB(appId)
+async function searchIndex(indexName, fnString) {
+ const db = getAppDB()
const designDoc = await db.get("_design/database")
designDoc.indexes = {
[indexName]: {
@@ -90,9 +89,8 @@ async function searchIndex(appId, indexName, fnString) {
await db.put(designDoc)
}
-exports.createAllSearchIndex = async appId => {
+exports.createAllSearchIndex = async () => {
await searchIndex(
- appId,
SearchIndexes.ROWS,
function (doc) {
function idx(input, prev) {
diff --git a/packages/server/src/environment.js b/packages/server/src/environment.js
index 99343937d9..7ed8b16b6f 100644
--- a/packages/server/src/environment.js
+++ b/packages/server/src/environment.js
@@ -2,7 +2,8 @@ function isTest() {
return (
process.env.NODE_ENV === "jest" ||
process.env.NODE_ENV === "cypress" ||
- process.env.JEST_WORKER_ID != null
+ (process.env.JEST_WORKER_ID != null &&
+ process.env.JEST_WORKER_ID !== "null")
)
}
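The tightened check treats the literal string `"null"` as "not in a Jest worker", since env vars stringified across process boundaries can turn `null` into `"null"`. A self-contained sketch of the predicate:

```javascript
// Sketch of the stricter isTest check from the diff, parameterised over env
// so it can be exercised without touching process.env.
function isTest(env) {
  return (
    env.NODE_ENV === "jest" ||
    env.NODE_ENV === "cypress" ||
    (env.JEST_WORKER_ID != null && env.JEST_WORKER_ID !== "null")
  )
}
```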
diff --git a/packages/server/src/integrations/base/sql.ts b/packages/server/src/integrations/base/sql.ts
index c1641e8626..a4220565cf 100644
--- a/packages/server/src/integrations/base/sql.ts
+++ b/packages/server/src/integrations/base/sql.ts
@@ -166,15 +166,13 @@ class InternalBuilder {
addSorting(query: KnexQuery, json: QueryJson): KnexQuery {
let { sort, paginate } = json
- if (!sort) {
- return query
- }
const table = json.meta?.table
- for (let [key, value] of Object.entries(sort)) {
- const direction = value === SortDirection.ASCENDING ? "asc" : "desc"
- query = query.orderBy(`${table?.name}.${key}`, direction)
- }
- if (this.client === SqlClients.MS_SQL && !sort && paginate?.limit) {
+ if (sort) {
+ for (let [key, value] of Object.entries(sort)) {
+ const direction = value === SortDirection.ASCENDING ? "asc" : "desc"
+ query = query.orderBy(`${table?.name}.${key}`, direction)
+ }
+ } else if (this.client === SqlClients.MS_SQL && paginate?.limit) {
// @ts-ignore
query = query.orderBy(`${table?.name}.${table?.primary[0]}`)
}
@@ -191,29 +189,70 @@ class InternalBuilder {
if (!relationships) {
return query
}
+ const tableSets: Record<string, any[]> = {}
+ // aggregate relationships into table sets - those that share the same
+ // to/through tables are grouped so they produce a single join
for (let relationship of relationships) {
- const from = relationship.from,
- to = relationship.to,
- toTable = relationship.tableName
- if (!relationship.through) {
+ const keyObj: { toTable: string; throughTable: string | undefined } = {
+ toTable: relationship.tableName,
+ throughTable: undefined,
+ }
+ if (relationship.through) {
+ keyObj.throughTable = relationship.through
+ }
+ const key = JSON.stringify(keyObj)
+ if (tableSets[key]) {
+ tableSets[key].push(relationship)
+ } else {
+ tableSets[key] = [relationship]
+ }
+ }
+ for (let [key, relationships] of Object.entries(tableSets)) {
+ const { toTable, throughTable } = JSON.parse(key)
+ if (!throughTable) {
// @ts-ignore
- query = query.leftJoin(
+ query = query.join(
toTable,
- `${fromTable}.${from}`,
- `${toTable}.${to}`
+ function () {
+ for (let relationship of relationships) {
+ const from = relationship.from,
+ to = relationship.to
+ // @ts-ignore
+ this.orOn(`${fromTable}.${from}`, "=", `${toTable}.${to}`)
+ }
+ },
+ "left"
)
} else {
- const throughTable = relationship.through
- const fromPrimary = relationship.fromPrimary
- const toPrimary = relationship.toPrimary
query = query
// @ts-ignore
- .leftJoin(
+ .join(
throughTable,
- `${fromTable}.${fromPrimary}`,
- `${throughTable}.${from}`
+ function () {
+ for (let relationship of relationships) {
+ const fromPrimary = relationship.fromPrimary
+ const from = relationship.from
+ // @ts-ignore
+ this.orOn(
+ `${fromTable}.${fromPrimary}`,
+ "=",
+ `${throughTable}.${from}`
+ )
+ }
+ },
+ "left"
+ )
+ .join(
+ toTable,
+ function () {
+ for (let relationship of relationships) {
+ const toPrimary = relationship.toPrimary
+ const to = relationship.to
+ // @ts-ignore
+ this.orOn(`${toTable}.${toPrimary}`, `${throughTable}.${to}`)
+ }
+ },
+ "left"
)
- .leftJoin(toTable, `${toTable}.${toPrimary}`, `${throughTable}.${to}`)
}
}
return query.limit(BASE_LIMIT)
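The grouping step above can be exercised in isolation: relationships targeting the same to/through table pair collapse into one entry, so the builder emits a single left join with OR'd conditions rather than one join per relationship. A sketch of just that aggregation:

```javascript
// Sketch of the relationship grouping from addRelationships.
function groupRelationships(relationships) {
  const tableSets = {}
  for (const relationship of relationships) {
    // key on the (toTable, throughTable) pair
    const key = JSON.stringify({
      toTable: relationship.tableName,
      throughTable: relationship.through || undefined,
    })
    if (tableSets[key]) {
      tableSets[key].push(relationship)
    } else {
      tableSets[key] = [relationship]
    }
  }
  return tableSets
}

const grouped = groupRelationships([
  { tableName: "orders", from: "id", to: "user_id" },
  { tableName: "orders", from: "alt_id", to: "alt_user_id" },
  { tableName: "tags", from: "id", to: "item_id", through: "items_tags" },
])
```

Here the two `orders` relationships share one key while the through-table relationship gets its own.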
diff --git a/packages/server/src/integrations/utils.ts b/packages/server/src/integrations/utils.ts
index 8fe8fedcc8..1341f5abca 100644
--- a/packages/server/src/integrations/utils.ts
+++ b/packages/server/src/integrations/utils.ts
@@ -52,7 +52,10 @@ export function buildExternalTableId(datasourceId: string, tableName: string) {
return `${datasourceId}${DOUBLE_SEPARATOR}${tableName}`
}
-export function breakExternalTableId(tableId: string) {
+export function breakExternalTableId(tableId: string | undefined) {
+ if (!tableId) {
+ return {}
+ }
const parts = tableId.split(DOUBLE_SEPARATOR)
let tableName = parts.pop()
// if they need joined
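The new guard lets `breakExternalTableId` tolerate an undefined ID instead of throwing on `split`. A sketch of the round trip, with the separator as an illustrative assumption rather than the value `integrations/utils` actually uses:

```javascript
// Sketch of the external table ID scheme (separator is assumed).
const DOUBLE_SEPARATOR = "__"

function buildExternalTableId(datasourceId, tableName) {
  return `${datasourceId}${DOUBLE_SEPARATOR}${tableName}`
}

function breakExternalTableId(tableId) {
  // guard added in the diff: tolerate undefined instead of throwing
  if (!tableId) {
    return {}
  }
  const parts = tableId.split(DOUBLE_SEPARATOR)
  const tableName = parts.pop()
  return { datasourceId: parts.join(DOUBLE_SEPARATOR), tableName }
}

const id = buildExternalTableId("datasource_plus_abc", "users")
```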
diff --git a/packages/server/src/middleware/authorized.js b/packages/server/src/middleware/authorized.js
index 7125ec3246..c8d6497ca3 100644
--- a/packages/server/src/middleware/authorized.js
+++ b/packages/server/src/middleware/authorized.js
@@ -10,6 +10,7 @@ const {
const builderMiddleware = require("./builder")
const { isWebhookEndpoint } = require("./utils")
const { buildCsrfMiddleware } = require("@budibase/backend-core/auth")
+const { getAppId } = require("@budibase/backend-core/context")
function hasResource(ctx) {
return ctx.resourceId != null
@@ -45,7 +46,7 @@ const checkAuthorizedResource = async (
) => {
// get the user's roles
const roleId = ctx.roleId || BUILTIN_ROLE_IDS.PUBLIC
- const userRoles = await getUserRoleHierarchy(ctx.appId, roleId, {
+ const userRoles = await getUserRoleHierarchy(roleId, {
idOnly: false,
})
const permError = "User does not have permission"
@@ -81,8 +82,9 @@ module.exports =
// get the resource roles
let resourceRoles = []
- if (ctx.appId && hasResource(ctx)) {
- resourceRoles = await getRequiredResourceRole(ctx.appId, permLevel, ctx)
+ const appId = getAppId()
+ if (appId && hasResource(ctx)) {
+ resourceRoles = await getRequiredResourceRole(permLevel, ctx)
}
// if the resource is public, proceed
diff --git a/packages/server/src/middleware/builder.js b/packages/server/src/middleware/builder.js
index d2a8ee80f0..a6404780ff 100644
--- a/packages/server/src/middleware/builder.js
+++ b/packages/server/src/middleware/builder.js
@@ -5,7 +5,7 @@ const {
checkDebounce,
setDebounce,
} = require("../utilities/redis")
-const CouchDB = require("../db")
+const { getDB } = require("@budibase/backend-core/db")
const { DocumentTypes } = require("../db/utils")
const { PermissionTypes } = require("@budibase/backend-core/permissions")
const { app: appCache } = require("@budibase/backend-core/cache")
@@ -48,7 +48,7 @@ async function updateAppUpdatedAt(ctx) {
if (ctx.method === "GET" || (await checkDebounce(appId))) {
return
}
- const db = new CouchDB(appId)
+ const db = getDB(appId)
const metadata = await db.get(DocumentTypes.APP_METADATA)
metadata.updatedAt = new Date().toISOString()
const response = await db.put(metadata)
diff --git a/packages/server/src/middleware/currentapp.js b/packages/server/src/middleware/currentapp.js
index 69f80c895b..70dd1bf578 100644
--- a/packages/server/src/middleware/currentapp.js
+++ b/packages/server/src/middleware/currentapp.js
@@ -11,9 +11,9 @@ const { generateUserMetadataID, isDevAppID } = require("../db/utils")
const { dbExists } = require("@budibase/backend-core/db")
const { isUserInAppTenant } = require("@budibase/backend-core/tenancy")
const { getCachedSelf } = require("../utilities/global")
-const CouchDB = require("../db")
const env = require("../environment")
const { isWebhookEndpoint } = require("./utils")
+const { doInAppContext } = require("@budibase/backend-core/context")
module.exports = async (ctx, next) => {
// try to get the appID from the request
@@ -31,7 +31,7 @@ module.exports = async (ctx, next) => {
// check the app exists referenced in cookie
if (appCookie) {
const appId = appCookie.appId
- const exists = await dbExists(CouchDB, appId)
+ const exists = await dbExists(appId)
if (!exists) {
clearCookie(ctx, Cookies.CurrentApp)
return next()
@@ -41,13 +41,15 @@ module.exports = async (ctx, next) => {
}
// deny access to application preview
- if (
- isDevAppID(requestAppId) &&
- !isWebhookEndpoint(ctx) &&
- (!ctx.user || !ctx.user.builder || !ctx.user.builder.global)
- ) {
- clearCookie(ctx, Cookies.CurrentApp)
- return ctx.redirect("/")
+ if (!env.isTest()) {
+ if (
+ isDevAppID(requestAppId) &&
+ !isWebhookEndpoint(ctx) &&
+ (!ctx.user || !ctx.user.builder || !ctx.user.builder.global)
+ ) {
+ clearCookie(ctx, Cookies.CurrentApp)
+ return ctx.redirect("/")
+ }
}
let appId,
@@ -68,44 +70,46 @@ module.exports = async (ctx, next) => {
return next()
}
- let noCookieSet = false
- // if the user not in the right tenant then make sure they have no permissions
- // need to judge this only based on the request app ID,
- if (
- env.MULTI_TENANCY &&
- ctx.user &&
- requestAppId &&
- !isUserInAppTenant(requestAppId)
- ) {
- // don't error, simply remove the users rights (they are a public user)
- delete ctx.user.builder
- delete ctx.user.admin
- delete ctx.user.roles
- roleId = BUILTIN_ROLE_IDS.PUBLIC
- noCookieSet = true
- }
-
- ctx.appId = appId
- if (roleId) {
- ctx.roleId = roleId
- const userId = ctx.user ? generateUserMetadataID(ctx.user._id) : null
- ctx.user = {
- ...ctx.user,
- // override userID with metadata one
- _id: userId,
- userId,
- roleId,
- role: await getRole(appId, roleId),
+ return doInAppContext(appId, async () => {
+ let noCookieSet = false
+ // if the user is not in the right tenant then make sure they have no
+ // permissions - this can only be judged based on the request app ID
+ if (
+ env.MULTI_TENANCY &&
+ ctx.user &&
+ requestAppId &&
+ !isUserInAppTenant(requestAppId)
+ ) {
+ // don't error, simply remove the user's rights (they are a public user)
+ delete ctx.user.builder
+ delete ctx.user.admin
+ delete ctx.user.roles
+ roleId = BUILTIN_ROLE_IDS.PUBLIC
+ noCookieSet = true
}
- }
- if (
- (requestAppId !== appId ||
- appCookie == null ||
- appCookie.appId !== requestAppId) &&
- !noCookieSet
- ) {
- setCookie(ctx, { appId }, Cookies.CurrentApp)
- }
- return next()
+ ctx.appId = appId
+ if (roleId) {
+ ctx.roleId = roleId
+ const userId = ctx.user ? generateUserMetadataID(ctx.user._id) : null
+ ctx.user = {
+ ...ctx.user,
+ // override userID with metadata one
+ _id: userId,
+ userId,
+ roleId,
+ role: await getRole(roleId),
+ }
+ }
+ if (
+ (requestAppId !== appId ||
+ appCookie == null ||
+ appCookie.appId !== requestAppId) &&
+ !noCookieSet
+ ) {
+ setCookie(ctx, { appId }, Cookies.CurrentApp)
+ }
+
+ return next()
+ })
}
diff --git a/packages/server/src/middleware/tests/authorized.spec.js b/packages/server/src/middleware/tests/authorized.spec.js
index 04ef6e2b07..9cfa9d368f 100644
--- a/packages/server/src/middleware/tests/authorized.spec.js
+++ b/packages/server/src/middleware/tests/authorized.spec.js
@@ -11,6 +11,9 @@ const authorizedMiddleware = require("../authorized")
const env = require("../../environment")
const { PermissionTypes, PermissionLevels } = require("@budibase/backend-core/permissions")
require("@budibase/backend-core").init(require("../../db"))
+const { doInAppContext } = require("@budibase/backend-core/context")
+
+const APP_ID = ""
class TestConfiguration {
constructor(role) {
@@ -23,7 +26,7 @@ class TestConfiguration {
request: {
url: ""
},
- appId: "",
+ appId: APP_ID,
auth: {},
next: this.next,
throw: this.throw,
@@ -32,7 +35,9 @@ class TestConfiguration {
}
executeMiddleware() {
- return this.middleware(this.ctx, this.next)
+ return doInAppContext(APP_ID, () => {
+ return this.middleware(this.ctx, this.next)
+ })
}
setUser(user) {
diff --git a/packages/server/src/middleware/tests/currentapp.spec.js b/packages/server/src/middleware/tests/currentapp.spec.js
index 27c88f3b48..4e53a6a4c0 100644
--- a/packages/server/src/middleware/tests/currentapp.spec.js
+++ b/packages/server/src/middleware/tests/currentapp.spec.js
@@ -1,6 +1,11 @@
mockAuthWithNoCookie()
mockWorker()
+jest.mock("@budibase/backend-core/db", () => ({
+ ...jest.requireActual("@budibase/backend-core/db"),
+ dbExists: () => true,
+}))
+
function mockWorker() {
jest.mock("../../utilities/workerRequests", () => ({
getGlobalSelf: () => {
@@ -50,6 +55,7 @@ function mockAuthWithCookie() {
return "app_test"
},
setCookie: jest.fn(),
+ clearCookie: jest.fn(),
getCookie: () => ({appId: "app_different", roleId: "PUBLIC"}),
}))
jest.mock("@budibase/backend-core/constants", () => ({
diff --git a/packages/server/src/middleware/usageQuota.js b/packages/server/src/middleware/usageQuota.js
index 2cd0836113..d8f028de3a 100644
--- a/packages/server/src/middleware/usageQuota.js
+++ b/packages/server/src/middleware/usageQuota.js
@@ -1,10 +1,10 @@
-const CouchDB = require("../db")
const usageQuota = require("../utilities/usageQuota")
const { getUniqueRows } = require("../utilities/usageQuota/rows")
const {
isExternalTable,
isRowId: isExternalRowId,
} = require("../integrations/utils")
+const { getAppDB } = require("@budibase/backend-core/context")
// currently only counting new writes and deletes
const METHOD_MAP = {
@@ -46,7 +46,7 @@ module.exports = async (ctx, next) => {
const usageId = ctx.request.body._id
try {
if (ctx.appId) {
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
await db.get(usageId)
}
return next()
diff --git a/packages/server/src/migrations/functions/usageQuotas/syncApps.ts b/packages/server/src/migrations/functions/usageQuotas/syncApps.ts
index 0fba4f0f7f..aec5541053 100644
--- a/packages/server/src/migrations/functions/usageQuotas/syncApps.ts
+++ b/packages/server/src/migrations/functions/usageQuotas/syncApps.ts
@@ -1,12 +1,12 @@
-const { getGlobalDB, getTenantId } = require("@budibase/backend-core/tenancy")
-const { getAllApps } = require("@budibase/backend-core/db")
-import CouchDB from "../../../db"
+import { getGlobalDB, getTenantId } from "@budibase/backend-core/tenancy"
+import { getAllApps } from "@budibase/backend-core/db"
import { getUsageQuotaDoc } from "../../../utilities/usageQuota"
export const run = async () => {
const db = getGlobalDB()
// get app count
- const devApps = await getAllApps(CouchDB, { dev: true })
+ // @ts-ignore
+ const devApps = await getAllApps({ dev: true })
const appCount = devApps ? devApps.length : 0
// sync app count
diff --git a/packages/server/src/migrations/functions/usageQuotas/syncRows.ts b/packages/server/src/migrations/functions/usageQuotas/syncRows.ts
index 58767d0c0a..2766a7c0d1 100644
--- a/packages/server/src/migrations/functions/usageQuotas/syncRows.ts
+++ b/packages/server/src/migrations/functions/usageQuotas/syncRows.ts
@@ -1,13 +1,14 @@
-const { getGlobalDB, getTenantId } = require("@budibase/backend-core/tenancy")
-const { getAllApps } = require("@budibase/backend-core/db")
-import CouchDB from "../../../db"
+import { getGlobalDB, getTenantId } from "@budibase/backend-core/tenancy"
+import { getAllApps } from "@budibase/backend-core/db"
import { getUsageQuotaDoc } from "../../../utilities/usageQuota"
import { getUniqueRows } from "../../../utilities/usageQuota/rows"
export const run = async () => {
const db = getGlobalDB()
// get all rows in all apps
- const allApps = await getAllApps(CouchDB, { all: true })
+ // @ts-ignore
+ const allApps = await getAllApps({ all: true })
+ // @ts-ignore
const appIds = allApps ? allApps.map((app: { appId: any }) => app.appId) : []
const rows = await getUniqueRows(appIds)
const rowCount = rows ? rows.length : 0
diff --git a/packages/server/src/module.d.ts b/packages/server/src/module.d.ts
new file mode 100644
index 0000000000..b7850efff3
--- /dev/null
+++ b/packages/server/src/module.d.ts
@@ -0,0 +1,3 @@
+declare module "@budibase/backend-core"
+declare module "@budibase/backend-core/tenancy"
+declare module "@budibase/backend-core/db"
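The shim above declares the backend-core entry points as untyped modules, which is why the call sites in syncApps.ts and syncRows.ts need `// @ts-ignore`. A possible refinement — signatures guessed from usage in this diff, not the real backend-core types — is to give one declaration a minimal typed surface so the ignores can go away:

```typescript
// Illustrative only: the option names below are taken from the call
// sites in this diff ({ dev: true } / { all: true }); the actual
// backend-core types may differ.
declare module "@budibase/backend-core/db" {
  export function getAllApps(opts?: { dev?: boolean; all?: boolean }): Promise<any[]>
}
```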
diff --git a/packages/server/src/tests/utilities/TestConfiguration.js b/packages/server/src/tests/utilities/TestConfiguration.js
index 68aa68dc66..6c2b7d4f98 100644
--- a/packages/server/src/tests/utilities/TestConfiguration.js
+++ b/packages/server/src/tests/utilities/TestConfiguration.js
@@ -1,3 +1,6 @@
+const core = require("@budibase/backend-core")
+const CouchDB = require("../../db")
+core.init(CouchDB)
const { BUILTIN_ROLE_IDS } = require("@budibase/backend-core/roles")
const env = require("../../environment")
const {
@@ -17,13 +20,11 @@ const supertest = require("supertest")
const { cleanup } = require("../../utilities/fileSystem")
const { Cookies, Headers } = require("@budibase/backend-core/constants")
const { jwt } = require("@budibase/backend-core/auth")
-const core = require("@budibase/backend-core")
const { getGlobalDB } = require("@budibase/backend-core/tenancy")
const { createASession } = require("@budibase/backend-core/sessions")
const { user: userCache } = require("@budibase/backend-core/cache")
-const CouchDB = require("../../db")
const newid = require("../../db/newid")
-core.init(CouchDB)
+const context = require("@budibase/backend-core/context")
const GLOBAL_USER_ID = "us_uuid1"
const EMAIL = "babs@babs.com"
@@ -65,11 +66,21 @@ class TestConfiguration {
request.request = {
body: config,
}
- if (params) {
- request.params = params
+ async function run() {
+ if (params) {
+ request.params = params
+ }
+ await controlFunc(request)
+ return request.body
+ }
+ // check if already in a context
+ if (context.getAppId() == null) {
+ return context.doInAppContext(this.appId, async () => {
+ return run()
+ })
+ } else {
+ return run()
}
- await controlFunc(request)
- return request.body
}
async globalUser({
@@ -175,6 +186,7 @@ class TestConfiguration {
// create dev app
this.app = await this._req({ name: appName }, null, controllers.app.create)
this.appId = this.app.appId
+ context.updateAppId(this.appId)
// create production app
this.prodApp = await this.deploy()
@@ -187,14 +199,16 @@ class TestConfiguration {
}
async deploy() {
- const deployment = await this._req(null, null, controllers.deploy.deployApp)
- const prodAppId = deployment.appId.replace("_dev", "")
- const appPackage = await this._req(
- null,
- { appId: prodAppId },
- controllers.app.fetchAppPackage
- )
- return appPackage.application
+ await this._req(null, null, controllers.deploy.deployApp)
+ const prodAppId = this.getAppId().replace("_dev", "")
+ return context.doInAppContext(prodAppId, async () => {
+ const appPackage = await this._req(
+ null,
+ { appId: prodAppId },
+ controllers.app.fetchAppPackage
+ )
+ return appPackage.application
+ })
}
async updateTable(config = null) {
@@ -423,42 +437,47 @@ class TestConfiguration {
async login({ roleId, userId, builder, prodApp = false } = {}) {
const appId = prodApp ? this.prodAppId : this.appId
-
- userId = !userId ? `us_uuid1` : userId
- if (!this.request) {
- throw "Server has not been opened, cannot login."
- }
- // make sure the user exists in the global DB
- if (roleId !== BUILTIN_ROLE_IDS.PUBLIC) {
- await this.globalUser({
- userId,
- builder,
- roles: { [this.prodAppId]: roleId },
+ return context.doInAppContext(appId, async () => {
+ userId = !userId ? `us_uuid1` : userId
+ if (!this.request) {
+ throw "Server has not been opened, cannot login."
+ }
+ // make sure the user exists in the global DB
+ if (roleId !== BUILTIN_ROLE_IDS.PUBLIC) {
+ await this.globalUser({
+ id: userId,
+ builder,
+ roles: { [this.prodAppId]: roleId },
+ })
+ }
+ await createASession(userId, {
+ sessionId: "sessionid",
+ tenantId: TENANT_ID,
})
- }
- // have to fake this
- const auth = {
- userId,
- sessionId: "sessionid",
- tenantId: TENANT_ID,
- }
- const app = {
- roleId: roleId,
- appId,
- }
- const authToken = jwt.sign(auth, env.JWT_SECRET)
- const appToken = jwt.sign(app, env.JWT_SECRET)
+ // have to fake this
+ const auth = {
+ userId,
+ sessionId: "sessionid",
+ tenantId: TENANT_ID,
+ }
+ const app = {
+ roleId: roleId,
+ appId,
+ }
+ const authToken = jwt.sign(auth, env.JWT_SECRET)
+ const appToken = jwt.sign(app, env.JWT_SECRET)
- // returning necessary request headers
- await userCache.invalidateUser(userId)
- return {
- Accept: "application/json",
- Cookie: [
- `${Cookies.Auth}=${authToken}`,
- `${Cookies.CurrentApp}=${appToken}`,
- ],
- [Headers.APP_ID]: appId,
- }
+ // returning necessary request headers
+ await userCache.invalidateUser(userId)
+ return {
+ Accept: "application/json",
+ Cookie: [
+ `${Cookies.Auth}=${authToken}`,
+ `${Cookies.CurrentApp}=${appToken}`,
+ ],
+ [Headers.APP_ID]: appId,
+ }
+ })
}
}
diff --git a/packages/server/src/threads/automation.js b/packages/server/src/threads/automation.js
index 2a39773520..c0843a286c 100644
--- a/packages/server/src/threads/automation.js
+++ b/packages/server/src/threads/automation.js
@@ -5,11 +5,11 @@ const automationUtils = require("../automations/automationUtils")
const AutomationEmitter = require("../events/AutomationEmitter")
const { processObject } = require("@budibase/string-templates")
const { DEFAULT_TENANT_ID } = require("@budibase/backend-core/constants")
-const CouchDB = require("../db")
const { DocumentTypes, isDevAppID } = require("../db/utils")
const { doInTenant } = require("@budibase/backend-core/tenancy")
const usage = require("../utilities/usageQuota")
const { definitions: triggerDefs } = require("../automations/triggerInfo")
+const { doInAppContext, getAppDB } = require("@budibase/backend-core/context")
const FILTER_STEP_ID = actions.ACTION_DEFINITIONS.FILTER.stepId
const CRON_STEP_ID = triggerDefs.CRON.stepId
@@ -59,11 +59,10 @@ class Orchestrator {
}
async getApp() {
- const appId = this._appId
if (this._app) {
return this._app
}
- const db = new CouchDB(appId)
+ const db = getAppDB()
this._app = await db.get(DocumentTypes.APP_METADATA)
return this._app
}
@@ -131,16 +130,19 @@ class Orchestrator {
}
module.exports = (input, callback) => {
- const automationOrchestrator = new Orchestrator(
- input.data.automation,
- input.data.event
- )
- automationOrchestrator
- .execute()
- .then(response => {
- callback(null, response)
- })
- .catch(err => {
- callback(err)
- })
+ const appId = input.data.event.appId
+ doInAppContext(appId, () => {
+ const automationOrchestrator = new Orchestrator(
+ input.data.automation,
+ input.data.event
+ )
+ automationOrchestrator
+ .execute()
+ .then(response => {
+ callback(null, response)
+ })
+ .catch(err => {
+ callback(err)
+ })
+ })
}
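Both thread entry points (automation.js above and query.js below) bridge a promise-returning `execute()` into the `(input, callback)` shape the worker pool expects. A generic helper for that pattern — illustrative, not part of the Budibase codebase:

```javascript
// wrap a promise-returning function in node-style (input, callback) form,
// as done inline in threads/automation.js and threads/query.js
function toCallback(promiseFn) {
  return (input, callback) => {
    promiseFn(input)
      .then(result => callback(null, result))
      .catch(err => callback(err))
  }
}

// usage sketch: module.exports = toCallback(input => runInContext(input))
```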
diff --git a/packages/server/src/threads/query.js b/packages/server/src/threads/query.js
index ff3e101d48..5b1a30b57d 100644
--- a/packages/server/src/threads/query.js
+++ b/packages/server/src/threads/query.js
@@ -3,14 +3,10 @@ threadUtils.threadSetup()
const ScriptRunner = require("../utilities/scriptRunner")
const { integrations } = require("../integrations")
const { processStringSync } = require("@budibase/string-templates")
-const CouchDB = require("../db")
-
-const IS_TRIPLE_BRACE = new RegExp(/^{{3}.*}{3}$/)
-const IS_HANDLEBARS = new RegExp(/^{{2}.*}{2}$/)
+const { doInAppContext, getAppDB } = require("@budibase/backend-core/context")
class QueryRunner {
constructor(input, flags = { noRecursiveQuery: false }) {
- this.appId = input.appId
this.datasource = input.datasource
this.queryVerb = input.queryVerb
this.fields = input.fields
@@ -104,12 +100,11 @@ class QueryRunner {
}
async runAnotherQuery(queryId, parameters) {
- const db = new CouchDB(this.appId)
+ const db = getAppDB()
const query = await db.get(queryId)
const datasource = await db.get(query.datasourceId)
return new QueryRunner(
{
- appId: this.appId,
datasource,
queryVerb: query.queryVerb,
fields: query.fields,
@@ -166,10 +161,16 @@ class QueryRunner {
const responses = await Promise.all(dynamics)
for (let i = 0; i < foundVars.length; i++) {
const variable = foundVars[i]
- parameters[variable.name] = processStringSync(variable.value, {
- data: responses[i].rows,
- info: responses[i].extra,
- })
+ parameters[variable.name] = processStringSync(
+ variable.value,
+ {
+ data: responses[i].rows,
+ info: responses[i].extra,
+ },
+ {
+ escapeNewlines: true,
+ }
+ )
      // make sure it's known that this uses dynamic variables in case it fails
this.hasDynamicVariables = true
}
@@ -190,13 +191,10 @@ class QueryRunner {
enrichedQuery[key] = this.enrichQueryFields(fields[key], parameters)
} else if (typeof fields[key] === "string") {
// enrich string value as normal
- let value = fields[key]
- // add triple brace to avoid escaping e.g. '=' in cookie header
- if (IS_HANDLEBARS.test(value) && !IS_TRIPLE_BRACE.test(value)) {
- value = `{${value}}`
- }
- enrichedQuery[key] = processStringSync(value, parameters, {
+ enrichedQuery[key] = processStringSync(fields[key], parameters, {
+ noEscaping: true,
noHelpers: true,
+ escapeNewlines: true,
})
} else {
enrichedQuery[key] = fields[key]
@@ -223,12 +221,14 @@ class QueryRunner {
}
module.exports = (input, callback) => {
- const Runner = new QueryRunner(input)
- Runner.execute()
- .then(response => {
- callback(null, response)
- })
- .catch(err => {
- callback(err)
- })
+ doInAppContext(input.appId, () => {
+ const Runner = new QueryRunner(input)
+ Runner.execute()
+ .then(response => {
+ callback(null, response)
+ })
+ .catch(err => {
+ callback(err)
+ })
+ })
}
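The enrichment above now passes `escapeNewlines: true` to `processStringSync`. Assuming the option does what its name suggests — the exact semantics live in `@budibase/string-templates` — the transformation would look roughly like:

```javascript
// assumed behaviour: literal newlines in bound values become "\n"
// escape sequences so they survive inside, e.g., JSON request bodies
// built from templates
function escapeNewlines(value) {
  return typeof value === "string" ? value.replace(/\n/g, "\\n") : value
}
```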
diff --git a/packages/server/src/utilities/fileSystem/index.js b/packages/server/src/utilities/fileSystem/index.js
index b8ddb1a356..904b4ced18 100644
--- a/packages/server/src/utilities/fileSystem/index.js
+++ b/packages/server/src/utilities/fileSystem/index.js
@@ -1,5 +1,4 @@
const { budibaseTempDir } = require("../budibaseDir")
-const { isDev } = require("../index")
const fs = require("fs")
const { join } = require("path")
const uuid = require("uuid/v4")
@@ -20,6 +19,7 @@ const {
LINK_USER_METADATA_PREFIX,
} = require("../../db/utils")
const MemoryStream = require("memorystream")
+const { getAppId } = require("@budibase/backend-core/context")
const TOP_LEVEL_PATH = join(__dirname, "..", "..", "..")
const NODE_MODULES_PATH = join(TOP_LEVEL_PATH, "node_modules")
@@ -51,7 +51,7 @@ exports.init = () => {
* everything required to function is ready.
*/
exports.checkDevelopmentEnvironment = () => {
- if (!isDev()) {
+ if (!env.isDev() || env.isTest()) {
return
}
if (!fs.existsSync(budibaseTempDir())) {
@@ -251,7 +251,8 @@ exports.downloadTemplate = async (type, name) => {
/**
* Retrieves component libraries from object store (or tmp symlink if in local)
*/
-exports.getComponentLibraryManifest = async (appId, library) => {
+exports.getComponentLibraryManifest = async library => {
+ const appId = getAppId()
const filename = "manifest.json"
/* istanbul ignore next */
// when testing in cypress and so on we need to get the package
diff --git a/packages/server/src/utilities/global.js b/packages/server/src/utilities/global.js
index 7ef1c09405..f8ec5ea647 100644
--- a/packages/server/src/utilities/global.js
+++ b/packages/server/src/utilities/global.js
@@ -3,7 +3,7 @@ const {
getGlobalIDFromUserMetadataID,
} = require("../db/utils")
const { BUILTIN_ROLE_IDS } = require("@budibase/backend-core/roles")
-const { getDeployedAppID } = require("@budibase/backend-core/db")
+const { getProdAppID } = require("@budibase/backend-core/db")
const { getGlobalUserParams } = require("@budibase/backend-core/db")
const { user: userCache } = require("@budibase/backend-core/cache")
const {
@@ -11,8 +11,10 @@ const {
isUserInAppTenant,
} = require("@budibase/backend-core/tenancy")
const env = require("../environment")
+const { getAppId } = require("@budibase/backend-core/context")
-exports.updateAppRole = (appId, user) => {
+exports.updateAppRole = (user, { appId } = {}) => {
+ appId = appId || getAppId()
if (!user || !user.roles) {
return user
}
@@ -24,7 +26,7 @@ exports.updateAppRole = (appId, user) => {
return user
}
// always use the deployed app
- user.roleId = user.roles[getDeployedAppID(appId)]
+ user.roleId = user.roles[getProdAppID(appId)]
// if a role wasn't found then either set as admin (builder) or public (everyone else)
if (!user.roleId && user.builder && user.builder.global) {
user.roleId = BUILTIN_ROLE_IDS.ADMIN
@@ -35,18 +37,18 @@ exports.updateAppRole = (appId, user) => {
return user
}
-function processUser(appId, user) {
+function processUser(user, { appId } = {}) {
if (user) {
delete user.password
}
- return exports.updateAppRole(appId, user)
+ return exports.updateAppRole(user, { appId })
}
exports.getCachedSelf = async (ctx, appId) => {
// this has to be tenant aware, can't depend on the context to find it out
// running some middlewares before the tenancy causes context to break
const user = await userCache.getUser(ctx.user._id)
- return processUser(appId, user)
+ return processUser(user, { appId })
}
exports.getRawGlobalUser = async userId => {
@@ -54,12 +56,13 @@ exports.getRawGlobalUser = async userId => {
return db.get(getGlobalIDFromUserMetadataID(userId))
}
-exports.getGlobalUser = async (appId, userId) => {
+exports.getGlobalUser = async userId => {
let user = await exports.getRawGlobalUser(userId)
- return processUser(appId, user)
+ return processUser(user)
}
-exports.getGlobalUsers = async (appId = null, users = null) => {
+exports.getGlobalUsers = async (users = null) => {
+ const appId = getAppId()
const db = getGlobalDB()
let globalUsers
if (users) {
@@ -86,11 +89,11 @@ exports.getGlobalUsers = async (appId = null, users = null) => {
if (!appId) {
return globalUsers
}
- return globalUsers.map(user => exports.updateAppRole(appId, user))
+ return globalUsers.map(user => exports.updateAppRole(user))
}
-exports.getGlobalUsersFromMetadata = async (appId, users) => {
- const globalUsers = await exports.getGlobalUsers(appId, users)
+exports.getGlobalUsersFromMetadata = async users => {
+ const globalUsers = await exports.getGlobalUsers(users)
return users.map(user => {
const globalUser = globalUsers.find(
globalUser => globalUser && user._id.includes(globalUser._id)
diff --git a/packages/server/src/utilities/index.js b/packages/server/src/utilities/index.js
index 0dba11141c..d1e277541a 100644
--- a/packages/server/src/utilities/index.js
+++ b/packages/server/src/utilities/index.js
@@ -1,9 +1,9 @@
const env = require("../environment")
const { OBJ_STORE_DIRECTORY } = require("../constants")
const { sanitizeKey } = require("@budibase/backend-core/objectStore")
-const CouchDB = require("../db")
const { generateMetadataID } = require("../db/utils")
const Readable = require("stream").Readable
+const { getAppDB } = require("@budibase/backend-core/context")
const BB_CDN = "https://cdn.budi.live"
@@ -73,8 +73,8 @@ exports.attachmentsRelativeURL = attachmentKey => {
)
}
-exports.updateEntityMetadata = async (appId, type, entityId, updateFn) => {
- const db = new CouchDB(appId)
+exports.updateEntityMetadata = async (type, entityId, updateFn) => {
+ const db = getAppDB()
const id = generateMetadataID(type, entityId)
// read it to see if it exists, we'll overwrite it no matter what
let rev,
@@ -99,14 +99,14 @@ exports.updateEntityMetadata = async (appId, type, entityId, updateFn) => {
}
}
-exports.saveEntityMetadata = async (appId, type, entityId, metadata) => {
- return exports.updateEntityMetadata(appId, type, entityId, () => {
+exports.saveEntityMetadata = async (type, entityId, metadata) => {
+ return exports.updateEntityMetadata(type, entityId, () => {
return metadata
})
}
-exports.deleteEntityMetadata = async (appId, type, entityId) => {
- const db = new CouchDB(appId)
+exports.deleteEntityMetadata = async (type, entityId) => {
+ const db = getAppDB()
const id = generateMetadataID(type, entityId)
let rev
try {
@@ -141,16 +141,6 @@ exports.stringToReadStream = string => {
})
}
-exports.doesDatabaseExist = async dbName => {
- try {
- const db = new CouchDB(dbName, { skip_setup: true })
- const info = await db.info()
- return info && !info.error
- } catch (err) {
- return false
- }
-}
-
exports.formatBytes = bytes => {
const units = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]
const byteIncrements = 1024
diff --git a/packages/server/src/utilities/routing/index.js b/packages/server/src/utilities/routing/index.js
index 541733dcc4..b68001c3c3 100644
--- a/packages/server/src/utilities/routing/index.js
+++ b/packages/server/src/utilities/routing/index.js
@@ -1,9 +1,9 @@
-const CouchDB = require("../../db")
const { createRoutingView } = require("../../db/views/staticViews")
const { ViewNames, getQueryIndex, UNICODE_MAX } = require("../../db/utils")
+const { getAppDB } = require("@budibase/backend-core/context")
-exports.getRoutingInfo = async appId => {
- const db = new CouchDB(appId)
+exports.getRoutingInfo = async () => {
+ const db = getAppDB()
try {
const allRouting = await db.query(getQueryIndex(ViewNames.ROUTING), {
startKey: "",
@@ -14,8 +14,8 @@ exports.getRoutingInfo = async appId => {
// check if the view doesn't exist, it should for all new instances
/* istanbul ignore next */
if (err != null && err.name === "not_found") {
- await createRoutingView(appId)
- return exports.getRoutingInfo(appId)
+ await createRoutingView()
+ return exports.getRoutingInfo()
} else {
throw err
}
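`getRoutingInfo` keeps its create-view-and-retry shape across this change. Generalised, the pattern looks like this (`db` and `createView` are illustrative stand-ins for the CouchDB handle and `createRoutingView`):

```javascript
// query a design view; if it doesn't exist yet, build it once and retry
async function queryWithViewRetry(db, viewName, params, createView) {
  try {
    return await db.query(viewName, params)
  } catch (err) {
    if (err != null && err.name === "not_found") {
      await createView()
      return queryWithViewRetry(db, viewName, params, createView)
    }
    throw err
  }
}
```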
diff --git a/packages/server/src/utilities/rowProcessor/index.js b/packages/server/src/utilities/rowProcessor/index.js
index dc56312d63..18e0b14de6 100644
--- a/packages/server/src/utilities/rowProcessor/index.js
+++ b/packages/server/src/utilities/rowProcessor/index.js
@@ -7,10 +7,10 @@ const { deleteFiles } = require("../../utilities/fileSystem/utilities")
const { ObjectStoreBuckets } = require("../../constants")
const {
isProdAppID,
- getDeployedAppID,
+ getProdAppID,
dbExists,
} = require("@budibase/backend-core/db")
-const CouchDB = require("../../db")
+const { getAppId } = require("@budibase/backend-core/context")
const BASE_AUTO_ID = 1
@@ -253,26 +253,20 @@ exports.inputProcessing = (
/**
* This function enriches the input rows with anything they are supposed to contain, for example
* link records or attachment links.
- * @param {string} appId the app in which the request is looking for enriched rows.
 * @param {object} table the table from which these rows originally came, this is used to determine
* the schema of the rows and then enrich.
* @param {object[]|object} rows the rows which are to be enriched.
* @param {object} opts used to set some options for the output, such as disabling relationship squashing.
* @returns {object[]|object} the enriched rows will be returned.
*/
-exports.outputProcessing = async (
- { appId },
- table,
- rows,
- opts = { squash: true }
-) => {
+exports.outputProcessing = async (table, rows, opts = { squash: true }) => {
let wasArray = true
if (!(rows instanceof Array)) {
rows = [rows]
wasArray = false
}
// attach any linked row information
- let enriched = await linkRows.attachFullLinkedDocs(appId, table, rows)
+ let enriched = await linkRows.attachFullLinkedDocs(table, rows)
// process formulas
enriched = processFormulas(table, enriched, { dynamic: true })
@@ -291,18 +285,13 @@ exports.outputProcessing = async (
}
}
if (opts.squash) {
- enriched = await linkRows.squashLinksToPrimaryDisplay(
- appId,
- table,
- enriched
- )
+ enriched = await linkRows.squashLinksToPrimaryDisplay(table, enriched)
}
return wasArray ? enriched : enriched[0]
}
/**
* Clean up any attachments that were attached to a row.
- * @param {string} appId The ID of the app from which a row is being deleted.
* @param {object} table The table from which a row is being removed.
* @param {any} row optional - the row being removed.
* @param {any} rows optional - if multiple rows being deleted can do this in bulk.
@@ -311,15 +300,12 @@ exports.outputProcessing = async (
* deleted attachment columns.
* @return {Promise} When all attachments have been removed this will return.
*/
-exports.cleanupAttachments = async (
- appId,
- table,
- { row, rows, oldRow, oldTable }
-) => {
+exports.cleanupAttachments = async (table, { row, rows, oldRow, oldTable }) => {
+ const appId = getAppId()
if (!isProdAppID(appId)) {
- const prodAppId = getDeployedAppID(appId)
+ const prodAppId = getProdAppID(appId)
// if prod exists, then don't allow deleting
- const exists = await dbExists(CouchDB, prodAppId)
+ const exists = await dbExists(prodAppId)
if (exists) {
return
}
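`cleanupAttachments` now derives the production app ID from the ambient one via the renamed `getProdAppID`. Illustrative versions of the two backend-core helpers, based only on the naming convention visible in this diff (dev apps carry an `app_dev` prefix, production apps plain `app_`) — the real implementations live in `@budibase/backend-core/db`:

```javascript
// "app_dev_<uuid>" -> "app_<uuid>"; prod IDs pass through unchanged
function getProdAppID(appId) {
  return appId.replace("app_dev", "app")
}

function isProdAppID(appId) {
  return appId.startsWith("app_") && !appId.startsWith("app_dev")
}
```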
diff --git a/packages/server/src/utilities/usageQuota/index.js b/packages/server/src/utilities/usageQuota/index.js
index b0ff310aa3..e27877b977 100644
--- a/packages/server/src/utilities/usageQuota/index.js
+++ b/packages/server/src/utilities/usageQuota/index.js
@@ -52,6 +52,7 @@ exports.getUsageQuotaDoc = async db => {
* Given a specified tenantId this will add to the usage object for the specified property.
* @param {string} property The property which is to be added to (within the nested usageQuota object).
* @param {number} usage The amount (this can be negative) to adjust the number by.
+ * @param {object} opts optional - options such as dryRun, to check what the update would do.
* @returns {Promise} When this completes the API key will now be up to date - the quota period may have
* also been reset after this call.
*/
diff --git a/packages/server/src/utilities/usageQuota/rows.js b/packages/server/src/utilities/usageQuota/rows.js
index 67ad07410d..378caffc46 100644
--- a/packages/server/src/utilities/usageQuota/rows.js
+++ b/packages/server/src/utilities/usageQuota/rows.js
@@ -23,6 +23,7 @@ const getAppPairs = appIds => {
}
const getAppRows = async appId => {
+ // need to specify the app ID, as this is used for different apps in one call
const appDb = new CouchDB(appId)
const response = await appDb.allDocs(
getRowParams(null, null, {
diff --git a/packages/server/src/utilities/users.js b/packages/server/src/utilities/users.js
index 6144397bf1..b3601986d8 100644
--- a/packages/server/src/utilities/users.js
+++ b/packages/server/src/utilities/users.js
@@ -1,13 +1,13 @@
-const CouchDB = require("../db")
const { InternalTables } = require("../db/utils")
const { getGlobalUser } = require("../utilities/global")
+const { getAppDB } = require("@budibase/backend-core/context")
exports.getFullUser = async (ctx, userId) => {
- const global = await getGlobalUser(ctx.appId, userId)
+ const global = await getGlobalUser(userId)
let metadata
try {
// this will throw an error if the db doesn't exist, or there is no appId
- const db = new CouchDB(ctx.appId)
+ const db = getAppDB()
metadata = await db.get(userId)
} catch (err) {
// it is fine if there is no user metadata, just remove global db info
diff --git a/packages/server/src/utilities/workerRequests.js b/packages/server/src/utilities/workerRequests.js
index 5e46f1678f..91db63d2a4 100644
--- a/packages/server/src/utilities/workerRequests.js
+++ b/packages/server/src/utilities/workerRequests.js
@@ -1,7 +1,7 @@
const fetch = require("node-fetch")
const env = require("../environment")
const { checkSlashesInUrl } = require("./index")
-const { getDeployedAppID } = require("@budibase/backend-core/db")
+const { getProdAppID } = require("@budibase/backend-core/db")
const { updateAppRole } = require("./global")
const { Headers } = require("@budibase/backend-core/constants")
const { getTenantId, isTenantIdSet } = require("@budibase/backend-core/tenancy")
@@ -70,15 +70,15 @@ exports.getGlobalSelf = async (ctx, appId = null) => {
}
let json = await response.json()
if (appId) {
- json = updateAppRole(appId, json)
+ json = updateAppRole(json)
}
return json
}
exports.removeAppFromUserRoles = async (ctx, appId) => {
- const deployedAppId = getDeployedAppID(appId)
+ const prodAppId = getProdAppID(appId)
const response = await fetch(
- checkSlashesInUrl(env.WORKER_URL + `/api/global/roles/${deployedAppId}`),
+ checkSlashesInUrl(env.WORKER_URL + `/api/global/roles/${prodAppId}`),
request(ctx, {
method: "DELETE",
})
diff --git a/packages/server/tsconfig.json b/packages/server/tsconfig.json
index 6a5ba315a1..16ea88108e 100644
--- a/packages/server/tsconfig.json
+++ b/packages/server/tsconfig.json
@@ -9,10 +9,12 @@
"noImplicitAny": true,
"esModuleInterop": true,
"resolveJsonModule": true,
- "incremental": true
+ "incremental": true,
+ "types": [ "node", "jest"],
},
"include": [
- "./src/**/*"
+ "./src/**/*",
+ "./src/module.d.ts"
],
"exclude": [
"node_modules",
diff --git a/packages/server/yarn.lock b/packages/server/yarn.lock
index c8697bb9ec..2856286d01 100644
--- a/packages/server/yarn.lock
+++ b/packages/server/yarn.lock
@@ -983,10 +983,10 @@
resolved "https://registry.yarnpkg.com/@bcoe/v8-coverage/-/v8-coverage-0.2.3.tgz#75a2e8b51cb758a7553d6804a5932d7aace75c39"
integrity sha512-0hYQ8SB4Db5zvZB4axdMHGwEaQjkZzFjQiN9LVYvIFB2nSUHW9tYpxWriPrWDASIxiaXax83REcLxuSdnGPZtw==
-"@budibase/backend-core@^1.0.49-alpha.5":
- version "1.0.49-alpha.5"
- resolved "https://registry.yarnpkg.com/@budibase/backend-core/-/backend-core-1.0.49-alpha.5.tgz#53b36098981863fbc8d9e780b6ce9ca9e25c679d"
- integrity sha512-To/kIv7ClSeb7FuLFcLQ0s7Al2oOlVWPtXoHO2RVSJx6YSnrxzecrz5Di0OyXiSVldSzgcFA/hCAcw6cs/on9g==
+"@budibase/backend-core@^1.0.50-alpha.6":
+ version "1.0.56"
+ resolved "https://registry.yarnpkg.com/@budibase/backend-core/-/backend-core-1.0.56.tgz#898ad4df1d923527cb340ae76ec548d1e5234755"
+ integrity sha512-Cos2TgI6grgSTiLPGlAZb53BEWPrzU3e9UXhhTtwiROraYxkVQob6kOQMK7Cvnm3IC+viXjDRijysvactlRwFA==
dependencies:
"@techpass/passport-openidconnect" "^0.3.0"
aws-sdk "^2.901.0"
@@ -1056,26 +1056,65 @@
svelte-flatpickr "^3.2.3"
svelte-portal "^1.0.0"
-"@budibase/bbui@^1.0.49-alpha.5":
- version "1.58.13"
- resolved "https://registry.yarnpkg.com/@budibase/bbui/-/bbui-1.58.13.tgz#59df9c73def2d81c75dcbd2266c52c19db88dbd7"
- integrity sha512-Zk6CKXdBfKsTVzA1Xs5++shdSSZLfphVpZuKVbjfzkgtuhyH7ruucexuSHEpFsxjW5rEKgKIBoRFzCK5vPvN0w==
+"@budibase/bbui@^1.0.56":
+ version "1.0.56"
+ resolved "https://registry.yarnpkg.com/@budibase/bbui/-/bbui-1.0.56.tgz#3c8cd78c97a21a34fc4b9a45483c7eca739a1dd8"
+ integrity sha512-HNcQFSFyvtQBtDvqLhQwUCGVJsvNmRt5GZsgIZr73oKaowiC6cwBQm/uPg3I9GxqrTz6etPy0E5ELm/cZKptxA==
dependencies:
- markdown-it "^12.0.2"
- quill "^1.3.7"
- sirv-cli "^0.4.6"
- svelte-flatpickr "^2.4.0"
+ "@adobe/spectrum-css-workflow-icons" "^1.2.1"
+ "@spectrum-css/actionbutton" "^1.0.1"
+ "@spectrum-css/actiongroup" "^1.0.1"
+ "@spectrum-css/avatar" "^3.0.2"
+ "@spectrum-css/button" "^3.0.1"
+ "@spectrum-css/buttongroup" "^3.0.2"
+ "@spectrum-css/checkbox" "^3.0.2"
+ "@spectrum-css/dialog" "^3.0.1"
+ "@spectrum-css/divider" "^1.0.3"
+ "@spectrum-css/dropzone" "^3.0.2"
+ "@spectrum-css/fieldgroup" "^3.0.2"
+ "@spectrum-css/fieldlabel" "^3.0.1"
+ "@spectrum-css/icon" "^3.0.1"
+ "@spectrum-css/illustratedmessage" "^3.0.2"
+ "@spectrum-css/inlinealert" "^2.0.1"
+ "@spectrum-css/inputgroup" "^3.0.2"
+ "@spectrum-css/label" "^2.0.10"
+ "@spectrum-css/link" "^3.1.1"
+ "@spectrum-css/menu" "^3.0.1"
+ "@spectrum-css/modal" "^3.0.1"
+ "@spectrum-css/pagination" "^3.0.3"
+ "@spectrum-css/picker" "^1.0.1"
+ "@spectrum-css/popover" "^3.0.1"
+ "@spectrum-css/progressbar" "^1.0.2"
+ "@spectrum-css/progresscircle" "^1.0.2"
+ "@spectrum-css/radio" "^3.0.2"
+ "@spectrum-css/search" "^3.0.2"
+ "@spectrum-css/sidenav" "^3.0.2"
+ "@spectrum-css/statuslight" "^3.0.2"
+ "@spectrum-css/stepper" "^3.0.3"
+ "@spectrum-css/switch" "^1.0.2"
+ "@spectrum-css/table" "^3.0.1"
+ "@spectrum-css/tabs" "^3.0.1"
+ "@spectrum-css/tags" "^3.0.2"
+ "@spectrum-css/textfield" "^3.0.1"
+ "@spectrum-css/toast" "^3.0.1"
+ "@spectrum-css/tooltip" "^3.0.3"
+ "@spectrum-css/treeview" "^3.0.2"
+ "@spectrum-css/typography" "^3.0.1"
+ "@spectrum-css/underlay" "^2.0.9"
+ "@spectrum-css/vars" "^3.0.1"
+ dayjs "^1.10.4"
+ easymde "^2.16.1"
+ svelte-flatpickr "^3.2.3"
svelte-portal "^1.0.0"
- turndown "^7.0.0"
-"@budibase/client@^1.0.49-alpha.5":
- version "1.0.49-alpha.5"
- resolved "https://registry.yarnpkg.com/@budibase/client/-/client-1.0.49-alpha.5.tgz#6e0802870be60c067bcd787bc6857095a830bbb8"
- integrity sha512-URWoHKbhZayFZ6bEcnXcnLEHWmnw0wDfF2bj2JR+M0/CktihcXySFNKHGW5Qt4O0kK6vhnAEd7Bz1KdfTnfaBg==
+"@budibase/client@^1.0.50-alpha.6":
+ version "1.0.56"
+ resolved "https://registry.yarnpkg.com/@budibase/client/-/client-1.0.56.tgz#58376ee2d6f64d1a37e6cc8898006a74f5f810b2"
+ integrity sha512-a7D+AKzutYk3N5OlKndv4BYtt0vnkDysjNLRtkZnKj3ERwQzI3bERrs+Wp2o5KI3cbZFWYR8x179YYxjBTcnYg==
dependencies:
- "@budibase/bbui" "^1.0.49-alpha.5"
+ "@budibase/bbui" "^1.0.56"
"@budibase/standard-components" "^0.9.139"
- "@budibase/string-templates" "^1.0.49-alpha.5"
+ "@budibase/string-templates" "^1.0.56"
regexparam "^1.3.0"
rollup-plugin-polyfill-node "^0.8.0"
shortid "^2.2.15"
@@ -1124,10 +1163,10 @@
svelte-apexcharts "^1.0.2"
svelte-flatpickr "^3.1.0"
-"@budibase/string-templates@^1.0.49-alpha.5":
- version "1.0.49-alpha.5"
- resolved "https://registry.yarnpkg.com/@budibase/string-templates/-/string-templates-1.0.49-alpha.5.tgz#e49078700ab142d9755ec9581315992cdcc1675f"
- integrity sha512-fRxDY37GVbV+dcg95lAiIDLBcplYpMvv0KnME+qC0ATuV3yGIr2MCPxhK6iUBLIcFUPXJhCkGhTMxd4zNJ2yGQ==
+"@budibase/string-templates@^1.0.50-alpha.6", "@budibase/string-templates@^1.0.56":
+ version "1.0.56"
+ resolved "https://registry.yarnpkg.com/@budibase/string-templates/-/string-templates-1.0.56.tgz#13869566e344ec175904e7b535dba31f8ddd723d"
+ integrity sha512-lynmirU/3v+RPmuLGUG3SQUZ8EhjkaGsGTCtzzJq+59bR72zkpoxk+p2XbtJn5AWZvd4VN7d57w3+wGRMPTzoQ==
dependencies:
"@budibase/handlebars-helpers" "^0.11.7"
dayjs "^1.10.4"
@@ -1872,11 +1911,6 @@
"@nodelib/fs.scandir" "2.1.5"
fastq "^1.6.0"
-"@polka/url@^0.5.0":
- version "0.5.0"
- resolved "https://registry.yarnpkg.com/@polka/url/-/url-0.5.0.tgz#b21510597fd601e5d7c95008b76bf0d254ebfd31"
- integrity sha512-oZLYFEAzUKyi3SKnXvj32ZCEGH6RDnao7COuCVhDydMS9NrCSVXhM79VaKyP5+Zc33m0QXEd2DN3UkU7OsHcfw==
-
"@rollup/plugin-inject@^4.0.0":
version "4.0.4"
resolved "https://registry.yarnpkg.com/@rollup/plugin-inject/-/plugin-inject-4.0.4.tgz#fbeee66e9a700782c4f65c8b0edbafe58678fbc2"
@@ -2082,6 +2116,11 @@
resolved "https://registry.yarnpkg.com/@spectrum-css/illustratedmessage/-/illustratedmessage-3.0.8.tgz#69ef0c935bcc5027f233a78de5aeb0064bf033cb"
integrity sha512-HvC4dywDi11GdrXQDCvKQ0vFlrXLTyJuc9UKf7meQLCGoJbGYDBwe+tHXNK1c6gPMD9BoL6pPMP1K/vRzR4EBQ==
+"@spectrum-css/inlinealert@^2.0.1":
+ version "2.0.6"
+ resolved "https://registry.yarnpkg.com/@spectrum-css/inlinealert/-/inlinealert-2.0.6.tgz#4c5e923a1f56a96cc1adb30ef1f06ae04f2c6376"
+ integrity sha512-OpvvoWP02wWyCnF4IgG8SOPkXymovkC9cGtgMS1FdDubnG3tJZB/JeKTsRR9C9Vt3WBaOmISRdSKlZ4lC9CFzA==
+
"@spectrum-css/inputgroup@^3.0.2":
version "3.0.8"
resolved "https://registry.yarnpkg.com/@spectrum-css/inputgroup/-/inputgroup-3.0.8.tgz#fc23afc8a73c24d17249c9d2337e8b42085b298b"
@@ -2327,6 +2366,13 @@
dependencies:
"@types/ioredis" "*"
+"@types/codemirror@^5.60.4":
+ version "5.60.5"
+ resolved "https://registry.yarnpkg.com/@types/codemirror/-/codemirror-5.60.5.tgz#5b989a3b4bbe657458cf372c92b6bfda6061a2b7"
+ integrity sha512-TiECZmm8St5YxjFUp64LK0c8WU5bxMDt9YaAek1UqUb9swrSCoJhh92fWu1p3mTEqlHjhB5sY7OFBhWroJXZVg==
+ dependencies:
+ "@types/tern" "*"
+
"@types/connect@*":
version "3.4.35"
resolved "https://registry.yarnpkg.com/@types/connect/-/connect-3.4.35.tgz#5fcf6ae445e4021d1fc2219a4873cc73a3bb2ad1"
@@ -2503,6 +2549,11 @@
"@types/koa-compose" "*"
"@types/node" "*"
+"@types/marked@^4.0.1":
+ version "4.0.2"
+ resolved "https://registry.yarnpkg.com/@types/marked/-/marked-4.0.2.tgz#cb2dbf10da2f41cf20bd91fb5f89b67540c282f7"
+ integrity sha512-auNrZ/c0w6wsM9DccwVxWHssrMDezHUAXNesdp2RQrCVCyrQbOiSq7yqdJKrUQQpw9VTm7CGYJH2A/YG7jjrjQ==
+
"@types/mime@^1":
version "1.3.2"
resolved "https://registry.yarnpkg.com/@types/mime/-/mime-1.3.2.tgz#93e25bf9ee75fe0fd80b594bc4feb0e862111b5a"
@@ -2577,6 +2628,13 @@
resolved "https://registry.yarnpkg.com/@types/stack-utils/-/stack-utils-2.0.1.tgz#20f18294f797f2209b5f65c8e3b5c8e8261d127c"
integrity sha512-Hl219/BT5fLAaz6NDkSuhzasy49dwQS/DSdu4MdggFB8zcXv7vflBI3xp7FEmkmdDkBUI2bPUNeMttp2knYdxw==
+"@types/tern@*":
+ version "0.23.4"
+ resolved "https://registry.yarnpkg.com/@types/tern/-/tern-0.23.4.tgz#03926eb13dbeaf3ae0d390caf706b2643a0127fb"
+ integrity sha512-JAUw1iXGO1qaWwEOzxTKJZ/5JxVeON9kvGZ/osgZaJImBnyjyn0cjovPsf6FNLmyGY8Vw9DoXZCMlfMkMwHRWg==
+ dependencies:
+ "@types/estree" "*"
+
"@types/yargs-parser@*":
version "20.2.1"
resolved "https://registry.yarnpkg.com/@types/yargs-parser/-/yargs-parser-20.2.1.tgz#3b9ce2489919d9e4fea439b76916abc34b2df129"
@@ -3298,9 +3356,9 @@ aws-sdk@^2.767.0:
xml2js "0.4.19"
aws-sdk@^2.901.0:
- version "2.1066.0"
- resolved "https://registry.yarnpkg.com/aws-sdk/-/aws-sdk-2.1066.0.tgz#2a9b00d983f3c740a7adda18d4e9a5c34d4d3887"
- integrity sha512-9BZPdJgIvau8Jf2l3PxInNqQd733uKLqGGDywMV71duxNTLgdBZe2zvCkbgl22+ldC8R2LVMdS64DzchfQIxHg==
+ version "2.1070.0"
+ resolved "https://registry.yarnpkg.com/aws-sdk/-/aws-sdk-2.1070.0.tgz#e7a27c34ed3a92776aa9128ed3469cb94bba9655"
+ integrity sha512-tkmuycoJ9k0qF1iq03iqyhevxP3l0OlrnUxjd0x8nZ9Ko1TGjyj0yJS4Vbd4r5RBpKUwRqedB7TAyZ/71mcZKw==
dependencies:
buffer "4.9.2"
events "1.1.1"
@@ -4016,11 +4074,6 @@ clone-response@1.0.2, clone-response@^1.0.2:
dependencies:
mimic-response "^1.0.0"
-clone@^2.1.1:
- version "2.1.2"
- resolved "https://registry.yarnpkg.com/clone/-/clone-2.1.2.tgz#1b7f4b9f591f1e8f83670401600345a02887435f"
- integrity sha1-G39Ln1kfHo+DZwQBYANFoCiHQ18=
-
cls-hooked@^4.2.2:
version "4.2.2"
resolved "https://registry.yarnpkg.com/cls-hooked/-/cls-hooked-4.2.2.tgz#ad2e9a4092680cdaffeb2d3551da0e225eae1908"
@@ -4050,6 +4103,18 @@ co@^4.6.0:
resolved "https://registry.yarnpkg.com/co/-/co-4.6.0.tgz#6ea6bdf3d853ae54ccb8e47bfa0bf3f9031fb184"
integrity sha1-bqa989hTrlTMuOR7+gvz+QMfsYQ=
+codemirror-spell-checker@1.1.2:
+ version "1.1.2"
+ resolved "https://registry.yarnpkg.com/codemirror-spell-checker/-/codemirror-spell-checker-1.1.2.tgz#1c660f9089483ccb5113b9ba9ca19c3f4993371e"
+ integrity sha1-HGYPkIlIPMtRE7m6nKGcP0mTNx4=
+ dependencies:
+ typo-js "*"
+
+codemirror@^5.63.1:
+ version "5.65.1"
+ resolved "https://registry.yarnpkg.com/codemirror/-/codemirror-5.65.1.tgz#5988a812c974c467f964bcc1a00c944e373de502"
+ integrity sha512-s6aac+DD+4O2u1aBmdxhB7yz2XU7tG3snOyQ05Kxifahz7hoxnfxIRHxiCSEv3TUC38dIVH8G+lZH9UWSfGQxA==
+
collect-v8-coverage@^1.0.0:
version "1.0.1"
resolved "https://registry.yarnpkg.com/collect-v8-coverage/-/collect-v8-coverage-1.0.1.tgz#cc2c8e94fc18bbdffe64d6534570c8a673b27f59"
@@ -4221,11 +4286,6 @@ configstore@^5.0.1:
write-file-atomic "^3.0.0"
xdg-basedir "^4.0.0"
-console-clear@^1.1.0:
- version "1.1.1"
- resolved "https://registry.yarnpkg.com/console-clear/-/console-clear-1.1.1.tgz#995e20cbfbf14dd792b672cde387bd128d674bf7"
- integrity sha512-pMD+MVR538ipqkG5JXeOEbKWS5um1H4LUUccUQG68qpeqBYbzYy79Gh55jkd2TtPdRfUaLWdv6LPP//5Zt0aPQ==
-
consolidate@^0.16.0:
version "0.16.0"
resolved "https://registry.yarnpkg.com/consolidate/-/consolidate-0.16.0.tgz#a11864768930f2f19431660a65906668f5fbdc16"
@@ -4566,18 +4626,6 @@ dedent@^0.7.0:
resolved "https://registry.yarnpkg.com/dedent/-/dedent-0.7.0.tgz#2495ddbaf6eb874abb0e1be9df22d2e5a544326c"
integrity sha1-JJXduvbrh0q7Dhvp3yLS5aVEMmw=
-deep-equal@^1.0.1:
- version "1.1.1"
- resolved "https://registry.yarnpkg.com/deep-equal/-/deep-equal-1.1.1.tgz#b5c98c942ceffaf7cb051e24e1434a25a2e6076a"
- integrity sha512-yd9c5AdiqVcR+JjcwUQb9DkhJc8ngNr0MahEBGvDiJw8puWab2yZlh+nkasOnZP+EGTAP6rRp2JzJhJZzvNF8g==
- dependencies:
- is-arguments "^1.0.4"
- is-date-object "^1.0.1"
- is-regex "^1.0.4"
- object-is "^1.0.1"
- object-keys "^1.1.1"
- regexp.prototype.flags "^1.2.0"
-
deep-equal@~1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/deep-equal/-/deep-equal-1.0.1.tgz#f5d260292b660e084eff4cdbc9f08ad3247448b5"
@@ -4788,11 +4836,6 @@ domexception@^2.0.1:
dependencies:
webidl-conversions "^5.0.0"
-domino@^2.1.6:
- version "2.1.6"
- resolved "https://registry.yarnpkg.com/domino/-/domino-2.1.6.tgz#fe4ace4310526e5e7b9d12c7de01b7f485a57ffe"
- integrity sha512-3VdM/SXBZX2omc9JF9nOPCtDaYQ67BGp5CoLpIQlO2KCAPETs8TcDHacF26jXadGbvUteZzRTeos2fhID5+ucQ==
-
dot-prop@^5.2.0:
version "5.3.0"
resolved "https://registry.yarnpkg.com/dot-prop/-/dot-prop-5.3.0.tgz#90ccce708cd9cd82cc4dc8c3ddd9abdd55b20e88"
@@ -4837,6 +4880,17 @@ duplexer3@^0.1.4:
resolved "https://registry.yarnpkg.com/duplexer3/-/duplexer3-0.1.4.tgz#ee01dd1cac0ed3cbc7fdbea37dc0a8f1ce002ce2"
integrity sha1-7gHdHKwO08vH/b6jfcCo8c4ALOI=
+easymde@^2.16.1:
+ version "2.16.1"
+ resolved "https://registry.yarnpkg.com/easymde/-/easymde-2.16.1.tgz#f4c2380312615cb33826f1a1fecfaa4022ff551a"
+ integrity sha512-FihYgjRsKfhGNk89SHSqxKLC4aJ1kfybPWW6iAmtb5GnXu+tnFPSzSaGBmk1RRlCuhFSjhF0SnIMGVPjEzkr6g==
+ dependencies:
+ "@types/codemirror" "^5.60.4"
+ "@types/marked" "^4.0.1"
+ codemirror "^5.63.1"
+ codemirror-spell-checker "1.1.2"
+ marked "^4.0.10"
+
ecc-jsbn@~0.1.1:
version "0.1.2"
resolved "https://registry.yarnpkg.com/ecc-jsbn/-/ecc-jsbn-0.1.2.tgz#3a83a904e54353287874c564b7549386849a98c9"
@@ -5381,11 +5435,6 @@ event-target-shim@^5.0.0:
resolved "https://registry.yarnpkg.com/event-target-shim/-/event-target-shim-5.0.1.tgz#5d4d3ebdf9583d63a5333ce2deb7480ab2b05789"
integrity sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ==
-eventemitter3@^2.0.3:
- version "2.0.3"
- resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-2.0.3.tgz#b5e1079b59fb5e1ba2771c0a993be060a58c99ba"
- integrity sha1-teEHm1n7XhuidxwKmTvgYKWMmbo=
-
events@1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/events/-/events-1.1.1.tgz#9ebdb7635ad099c70dcc4c2a1f5004288e8bd924"
@@ -5572,11 +5621,6 @@ fast-deep-equal@^3.1.1:
resolved "https://registry.yarnpkg.com/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz#3a7d56b559d6cbc3eb512325244e619a65c6c525"
integrity sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q==
-fast-diff@1.1.2:
- version "1.1.2"
- resolved "https://registry.yarnpkg.com/fast-diff/-/fast-diff-1.1.2.tgz#4b62c42b8e03de3f848460b639079920695d0154"
- integrity sha512-KaJUt+M9t1qaIteSvjc6P3RbMdXsNhK61GRftR6SNxqmhthcd9MGIi4T+o0jD8LUSpSnSKXE20nLtJ3fOHxQig==
-
fast-glob@^3.1.1:
version "3.2.7"
resolved "https://registry.yarnpkg.com/fast-glob/-/fast-glob-3.2.7.tgz#fd6cb7a2d7e9aa7a7846111e85a196d6b2f766a1"
@@ -6028,11 +6072,6 @@ get-paths@0.0.7:
dependencies:
pify "^4.0.1"
-get-port@^3.2.0:
- version "3.2.0"
- resolved "https://registry.yarnpkg.com/get-port/-/get-port-3.2.0.tgz#dd7ce7de187c06c8bf353796ac71e099f0980ebc"
- integrity sha1-3Xzn3hh8Bsi/NTeWrHHgmfCYDrw=
-
get-port@^5.1.1:
version "5.1.1"
resolved "https://registry.yarnpkg.com/get-port/-/get-port-5.1.1.tgz#0469ed07563479de6efb986baf053dcd7d4e3193"
@@ -6770,9 +6809,9 @@ ioredis@^4.27.0:
standard-as-callback "^2.1.0"
ioredis@^4.27.1:
- version "4.28.3"
- resolved "https://registry.yarnpkg.com/ioredis/-/ioredis-4.28.3.tgz#b13fce8a6a7c525ba22e666d72980a3c0ba799aa"
- integrity sha512-9JOWVgBnuSxpIgfpjc1OeY1OLmA4t2KOWWURTDRXky+eWO0LZhI33pQNT9gYxANUXfh5p/zYephYni6GPRsksQ==
+ version "4.28.5"
+ resolved "https://registry.yarnpkg.com/ioredis/-/ioredis-4.28.5.tgz#5c149e6a8d76a7f8fa8a504ffc85b7d5b6797f9f"
+ integrity sha512-3GYo0GJtLqgNXj4YhrisLaNNvWSNwSS2wS4OELGfGxH8I69+XfNdnmV1AyN+ZqMh0i7eX+SWjrwFKDBDgfBC1A==
dependencies:
cluster-key-slot "^1.1.0"
debug "^4.3.1"
@@ -6805,14 +6844,6 @@ is-accessor-descriptor@^1.0.0:
dependencies:
kind-of "^6.0.0"
-is-arguments@^1.0.4:
- version "1.1.1"
- resolved "https://registry.yarnpkg.com/is-arguments/-/is-arguments-1.1.1.tgz#15b3f88fda01f2a97fec84ca761a560f123efa9b"
- integrity sha512-8Q7EARjzEnKpt/PCD7e1cgUS0a6X8u5tdSiMqXhojOdoV9TsMsiO+9VLC5vAmO8N7/GmXn7yjR8qnA6bVAEzfA==
- dependencies:
- call-bind "^1.0.2"
- has-tostringtag "^1.0.0"
-
is-arrayish@^0.2.1:
version "0.2.1"
resolved "https://registry.yarnpkg.com/is-arrayish/-/is-arrayish-0.2.1.tgz#77c99840527aa8ecb1a8ba697b80645a7a926a9d"
@@ -7077,7 +7108,7 @@ is-property@^1.0.2:
resolved "https://registry.yarnpkg.com/is-property/-/is-property-1.0.2.tgz#57fe1c4e48474edd65b09911f26b1cd4095dda84"
integrity sha1-V/4cTkhHTt1lsJkR8msc1Ald2oQ=
-is-regex@^1.0.4, is-regex@^1.1.4:
+is-regex@^1.1.4:
version "1.1.4"
resolved "https://registry.yarnpkg.com/is-regex/-/is-regex-1.1.4.tgz#eef5663cd59fa4c0ae339505323df6854bb15958"
integrity sha512-kvRdxDsxZjhzUX07ZnLydzS1TU/TJlTUHHY4YLL87e37oUA49DfkLqgy+VjFocowy29cKvcSiu+kIv728jTTVg==
@@ -8361,7 +8392,7 @@ keyv@3.0.0, keyv@^3.0.0:
dependencies:
json-buffer "3.0.0"
-kind-of@^3.0.2, kind-of@^3.0.3, kind-of@^3.2.0:
+kind-of@^3.0.2, kind-of@^3.0.3, kind-of@^3.1.0, kind-of@^3.2.0:
version "3.2.2"
resolved "https://registry.yarnpkg.com/kind-of/-/kind-of-3.2.2.tgz#31ea21a734bab9bbb0f32466d893aea51e4a3c64"
integrity sha1-MeohpzS6ubuw8yRm2JOupR5KPGQ=
@@ -8392,7 +8423,7 @@ klaw-sync@^6.0.0:
dependencies:
graceful-fs "^4.1.11"
-kleur@^3.0.0, kleur@^3.0.3:
+kleur@^3.0.3:
version "3.0.3"
resolved "https://registry.yarnpkg.com/kleur/-/kleur-3.0.3.tgz#a79c9ecc86ee1ce3fa6206d1216c501f147fc07e"
integrity sha512-eTIzlVOSUR+JxdDFepEYcBMtZ9Qqdef+rnzWdRZuMbOywu5tO2w2N7rqjoANZ5k9vywhL6Br1VRjUIgTQx4E8w==
@@ -8816,11 +8847,6 @@ loader-utils@^2.0.0:
emojis-list "^3.0.0"
json5 "^2.1.2"
-local-access@^1.0.1:
- version "1.1.0"
- resolved "https://registry.yarnpkg.com/local-access/-/local-access-1.1.0.tgz#e007c76ba2ca83d5877ba1a125fc8dfe23ba4798"
- integrity sha512-XfegD5pyTAfb+GY6chk283Ox5z8WexG56OvM06RWLpAc/UHozO8X6xAxEkIitZOtsSMM1Yr3DkHgW5W+onLhCw==
-
locate-path@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/locate-path/-/locate-path-3.0.0.tgz#dbec3b3ab759758071b58fe59fc41871af21400e"
@@ -9094,17 +9120,6 @@ map-visit@^1.0.0:
dependencies:
object-visit "^1.0.0"
-markdown-it@^12.0.2:
- version "12.3.2"
- resolved "https://registry.yarnpkg.com/markdown-it/-/markdown-it-12.3.2.tgz#bf92ac92283fe983fe4de8ff8abfb5ad72cd0c90"
- integrity sha512-TchMembfxfNVpHkbtriWltGWc+m3xszaRD0CZup7GFFhzIgQqxIfn3eGj1yZpfuflzPvfkt611B2Q/Bsk1YnGg==
- dependencies:
- argparse "^2.0.1"
- entities "~2.1.0"
- linkify-it "^3.0.1"
- mdurl "^1.0.1"
- uc.micro "^1.0.5"
-
markdown-it@^12.2.0:
version "12.2.0"
resolved "https://registry.yarnpkg.com/markdown-it/-/markdown-it-12.2.0.tgz#091f720fd5db206f80de7a8d1f1a7035fd0d38db"
@@ -9116,6 +9131,11 @@ markdown-it@^12.2.0:
mdurl "^1.0.1"
uc.micro "^1.0.5"
+marked@^4.0.10:
+ version "4.0.12"
+ resolved "https://registry.yarnpkg.com/marked/-/marked-4.0.12.tgz#2262a4e6fd1afd2f13557726238b69a48b982f7d"
+ integrity sha512-hgibXWrEDNBWgGiK18j/4lkS6ihTe9sxtV4Q1OQppb/0zzyPSzoFANBa5MfsG/zgsWklmNnhm0XACZOH/0HBiQ==
+
md5@^2.3.0:
version "2.3.0"
resolved "https://registry.yarnpkg.com/md5/-/md5-2.3.0.tgz#c3da9a6aae3a30b46b7b0c349b87b110dc3bda4f"
@@ -9245,11 +9265,6 @@ mime@^1.3.4, mime@^1.4.1:
resolved "https://registry.yarnpkg.com/mime/-/mime-1.6.0.tgz#32cd9e5c64553bd58d19a568af452acff04981b1"
integrity sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==
-mime@^2.3.1:
- version "2.6.0"
- resolved "https://registry.yarnpkg.com/mime/-/mime-2.6.0.tgz#a2a682a95cd4d0cb1d6257e28f83da7e35800367"
- integrity sha512-USPkMeET31rOMiarsBNIHZKLGgvKc/LrjofAnBlOttf5ajRvqiRA8QsenbcooctK6d6Ts6aqZXBA+XbkKthiQg==
-
mimic-fn@^2.0.0, mimic-fn@^2.1.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/mimic-fn/-/mimic-fn-2.1.0.tgz#7ed2c2ccccaf84d3ffcb7a69b57711fc2083401b"
@@ -9334,11 +9349,6 @@ mri@1.1.4:
resolved "https://registry.yarnpkg.com/mri/-/mri-1.1.4.tgz#7cb1dd1b9b40905f1fac053abe25b6720f44744a"
integrity sha512-6y7IjGPm8AzlvoUrwAaw1tLnUBudaS3752vcd8JtrpGGQn+rXIe63LFVHm/YMwtqAuh+LJPCFdlLYPWM1nYn6w==
-mri@^1.1.0:
- version "1.2.0"
- resolved "https://registry.yarnpkg.com/mri/-/mri-1.2.0.tgz#6721480fec2a11a4889861115a48b6cbe7cc8f0b"
- integrity sha512-tzzskb3bG8LvYGFF/mDTpq3jpI6Q9wc3LEmBaghu+DdCssd1FakN7Bc0hVNmEyGq1bq3RgfkCb3cmQLpNPOroA==
-
ms@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/ms/-/ms-2.0.0.tgz#5608aeadfc00be6c2901df5f9861788de0d597c8"
@@ -9661,14 +9671,6 @@ object-inspect@^1.11.0, object-inspect@^1.9.0:
resolved "https://registry.yarnpkg.com/object-inspect/-/object-inspect-1.11.0.tgz#9dceb146cedd4148a0d9e51ab88d34cf509922b1"
integrity sha512-jp7ikS6Sd3GxQfZJPyH3cjcbJF6GZPClgdV+EFygjFLQ5FmW/dRUnTd9PQ9k0JhoNDabWFbpF1yCdSWCC6gexg==
-object-is@^1.0.1:
- version "1.1.5"
- resolved "https://registry.yarnpkg.com/object-is/-/object-is-1.1.5.tgz#b9deeaa5fc7f1846a0faecdceec138e5778f53ac"
- integrity sha512-3cyDsyHgtmi7I7DfSSI2LDp6SK2lwvtbg0p0R1e0RvTqF5ceGx+K2dfSjm1bKDMVCFEDAQvy+o8c6a7VujOddw==
- dependencies:
- call-bind "^1.0.2"
- define-properties "^1.1.3"
-
object-keys@^1.0.12, object-keys@^1.0.6, object-keys@^1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/object-keys/-/object-keys-1.1.1.tgz#1c47f272df277f3b1daf061677d9c82e2322c60e"
@@ -9897,11 +9899,6 @@ pako@^1.0.5:
resolved "https://registry.yarnpkg.com/pako/-/pako-1.0.11.tgz#6c9599d340d54dfd3946380252a35705a6b992bf"
integrity sha512-4hLB8Py4zZce5s4yd9XzopqwVv/yGNhV1Bl8NTmCq1763HeK2+EwVTv+leGeL13Dnh2wfbqowVPXCIO0z4taYw==
-parchment@^1.1.4:
- version "1.1.4"
- resolved "https://registry.yarnpkg.com/parchment/-/parchment-1.1.4.tgz#aeded7ab938fe921d4c34bc339ce1168bc2ffde5"
- integrity sha512-J5FBQt/pM2inLzg4hEWmzQx/8h8D0CiDxaG3vyp9rKrQRSDgBlhjdP5jQGgosEajXPSQouXGHOmVdgo7QmJuOg==
-
parent-module@^1.0.0:
version "1.0.1"
resolved "https://registry.yarnpkg.com/parent-module/-/parent-module-1.0.1.tgz#691d2709e78c79fae3a156622452d00762caaaa2"
@@ -10751,27 +10748,6 @@ quick-format-unescaped@^4.0.3:
resolved "https://registry.yarnpkg.com/quick-format-unescaped/-/quick-format-unescaped-4.0.4.tgz#93ef6dd8d3453cbc7970dd614fad4c5954d6b5a7"
integrity sha512-tYC1Q1hgyRuHgloV/YXs2w15unPVh8qfu/qCTfhTYamaw7fyhumKa2yGpdSo87vY32rIclj+4fWYQXUMs9EHvg==
-quill-delta@^3.6.2:
- version "3.6.3"
- resolved "https://registry.yarnpkg.com/quill-delta/-/quill-delta-3.6.3.tgz#b19fd2b89412301c60e1ff213d8d860eac0f1032"
- integrity sha512-wdIGBlcX13tCHOXGMVnnTVFtGRLoP0imqxM696fIPwIf5ODIYUHIvHbZcyvGlZFiFhK5XzDC2lpjbxRhnM05Tg==
- dependencies:
- deep-equal "^1.0.1"
- extend "^3.0.2"
- fast-diff "1.1.2"
-
-quill@^1.3.7:
- version "1.3.7"
- resolved "https://registry.yarnpkg.com/quill/-/quill-1.3.7.tgz#da5b2f3a2c470e932340cdbf3668c9f21f9286e8"
- integrity sha512-hG/DVzh/TiknWtE6QmWAF/pxoZKYxfe3J/d/+ShUWkDvvkZQVTPeVmUJVu1uE6DDooC4fWTiCLh84ul89oNz5g==
- dependencies:
- clone "^2.1.1"
- deep-equal "^1.0.1"
- eventemitter3 "^2.0.3"
- extend "^3.0.2"
- parchment "^1.1.4"
- quill-delta "^3.6.2"
-
randombytes@^2.1.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/randombytes/-/randombytes-2.1.0.tgz#df6f84372f0270dc65cdf6291349ab7a473d4f2a"
@@ -10987,14 +10963,6 @@ regex-not@^1.0.0, regex-not@^1.0.2:
extend-shallow "^3.0.2"
safe-regex "^1.1.0"
-regexp.prototype.flags@^1.2.0:
- version "1.4.1"
- resolved "https://registry.yarnpkg.com/regexp.prototype.flags/-/regexp.prototype.flags-1.4.1.tgz#b3f4c0059af9e47eca9f3f660e51d81307e72307"
- integrity sha512-pMR7hBVUUGI7PMA37m2ofIdQCsomVnas+Jn5UPGAHQ+/LlwKm/aTLJHdasmHRzlfeZwHiAOaRSo2rbBDm3nNUQ==
- dependencies:
- call-bind "^1.0.2"
- define-properties "^1.1.3"
-
regexparam@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/regexparam/-/regexparam-2.0.0.tgz#059476767d5f5f87f735fc7922d133fd1a118c8c"
@@ -11271,13 +11239,6 @@ rxjs@^6.6.0:
dependencies:
tslib "^1.9.0"
-sade@^1.4.0:
- version "1.8.1"
- resolved "https://registry.yarnpkg.com/sade/-/sade-1.8.1.tgz#0a78e81d658d394887be57d2a409bf703a3b2701"
- integrity sha512-xal3CZX1Xlo/k4ApwCFrHVACi9fBqJ7V+mwhBsuf/1IOKbBy098Fex+Wa/5QMubw09pSZ/u8EY8PWgevJsXp1A==
- dependencies:
- mri "^1.1.0"
-
safe-buffer@*, safe-buffer@^5.1.0, safe-buffer@^5.1.1, safe-buffer@^5.1.2, safe-buffer@~5.2.0:
version "5.2.1"
resolved "https://registry.yarnpkg.com/safe-buffer/-/safe-buffer-5.2.1.tgz#1eaf9fa9bdb1fdd4ec75f58f9cdb4e6b7827eec6"
@@ -11536,27 +11497,6 @@ simple-swizzle@^0.2.2:
dependencies:
is-arrayish "^0.3.1"
-sirv-cli@^0.4.6:
- version "0.4.6"
- resolved "https://registry.yarnpkg.com/sirv-cli/-/sirv-cli-0.4.6.tgz#c28ab20deb3b34637f5a60863dc350f055abca04"
- integrity sha512-/Vj85/kBvPL+n9ibgX6FicLE8VjidC1BhlX67PYPBfbBAphzR6i0k0HtU5c2arejfU3uzq8l3SYPCwl1x7z6Ww==
- dependencies:
- console-clear "^1.1.0"
- get-port "^3.2.0"
- kleur "^3.0.0"
- local-access "^1.0.1"
- sade "^1.4.0"
- sirv "^0.4.6"
- tinydate "^1.0.0"
-
-sirv@^0.4.6:
- version "0.4.6"
- resolved "https://registry.yarnpkg.com/sirv/-/sirv-0.4.6.tgz#185e44eb93d24009dd183b7494285c5180b81f22"
- integrity sha512-rYpOXlNbpHiY4nVXxuDf4mXPvKz1reZGap/LkWp9TvcZ84qD/nPBjjH/6GZsgIjVMbOslnY8YYULAyP8jMn1GQ==
- dependencies:
- "@polka/url" "^0.5.0"
- mime "^2.3.1"
-
sisteransi@^1.0.5:
version "1.0.5"
resolved "https://registry.yarnpkg.com/sisteransi/-/sisteransi-1.0.5.tgz#134d681297756437cc05ca01370d3a7a571075ed"
@@ -12110,13 +12050,6 @@ svelte-apexcharts@^1.0.2:
dependencies:
apexcharts "^3.19.2"
-svelte-flatpickr@^2.4.0:
- version "2.4.0"
- resolved "https://registry.yarnpkg.com/svelte-flatpickr/-/svelte-flatpickr-2.4.0.tgz#190871fc3305956c8c8fd3601cd036b8ac71ef49"
- integrity sha512-UUC5Te+b0qi4POg7VDwfGh0m5W3Hf64OwkfOTj6FEe/dYZN4cBzpQ82EuuQl0CTbbBAsMkcjJcixV1d2V6EHCQ==
- dependencies:
- flatpickr "^4.5.2"
-
svelte-flatpickr@^3.1.0, svelte-flatpickr@^3.2.3:
version "3.2.4"
resolved "https://registry.yarnpkg.com/svelte-flatpickr/-/svelte-flatpickr-3.2.4.tgz#1824e26a5dc151d14906cfc7dfd100aefd1b072d"
@@ -12426,11 +12359,6 @@ tinycolor2@^1.4.1:
resolved "https://registry.yarnpkg.com/tinycolor2/-/tinycolor2-1.4.2.tgz#3f6a4d1071ad07676d7fa472e1fac40a719d8803"
integrity sha512-vJhccZPs965sV/L2sU4oRQVAos0pQXwsvTLkWYdqJ+a8Q5kPFzJTuOFwy7UniPli44NKQGAglksjvOcpo95aZA==
-tinydate@^1.0.0:
- version "1.3.0"
- resolved "https://registry.yarnpkg.com/tinydate/-/tinydate-1.3.0.tgz#e6ca8e5a22b51bb4ea1c3a2a4fd1352dbd4c57fb"
- integrity sha512-7cR8rLy2QhYHpsBDBVYnnWXm8uRTr38RoZakFSW7Bs7PzfMPNZthuMLkwqZv7MTu8lhQ91cOFYS5a7iFj2oR3w==
-
tmp@^0.0.33:
version "0.0.33"
resolved "https://registry.yarnpkg.com/tmp/-/tmp-0.0.33.tgz#6d34335889768d21b2bcda0aa277ced3b1bfadf9"
@@ -12642,13 +12570,6 @@ tunnel@0.0.6:
resolved "https://registry.yarnpkg.com/tunnel/-/tunnel-0.0.6.tgz#72f1314b34a5b192db012324df2cc587ca47f92c"
integrity sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg==
-turndown@^7.0.0:
- version "7.1.1"
- resolved "https://registry.yarnpkg.com/turndown/-/turndown-7.1.1.tgz#96992f2d9b40a1a03d3ea61ad31b5a5c751ef77f"
- integrity sha512-BEkXaWH7Wh7e9bd2QumhfAXk5g34+6QUmmWx+0q6ThaVOLuLUqsnkq35HQ5SBHSaxjSfSM7US5o4lhJNH7B9MA==
- dependencies:
- domino "^2.1.6"
-
tweetnacl@^0.14.3, tweetnacl@~0.14.0:
version "0.14.5"
resolved "https://registry.yarnpkg.com/tweetnacl/-/tweetnacl-0.14.5.tgz#5ae68177f192d4456269d108afa93ff8743f4f64"
@@ -12701,11 +12622,23 @@ typedarray-to-buffer@^3.1.5:
dependencies:
is-typedarray "^1.0.0"
+typeof-article@^0.1.1:
+ version "0.1.1"
+ resolved "https://registry.yarnpkg.com/typeof-article/-/typeof-article-0.1.1.tgz#9f07e733c3fbb646ffa9e61c08debacd460e06af"
+ integrity sha1-nwfnM8P7tkb/qeYcCN66zUYOBq8=
+ dependencies:
+ kind-of "^3.1.0"
+
typescript@^4.3.5:
version "4.3.5"
resolved "https://registry.yarnpkg.com/typescript/-/typescript-4.3.5.tgz#4d1c37cc16e893973c45a06886b7113234f119f4"
integrity sha512-DqQgihaQ9cUrskJo9kIyW/+g0Vxsk8cDtZ52a3NGh0YNTfpUSArXSohyUGnvbPazEPLu398C0UxmKSOrPumUzA==
+typo-js@*:
+ version "1.2.1"
+ resolved "https://registry.yarnpkg.com/typo-js/-/typo-js-1.2.1.tgz#334a0d8c3f6c56f2f1e15fdf6c31677793cbbe9b"
+ integrity sha512-bTGLjbD3WqZDR3CgEFkyi9Q/SS2oM29ipXrWfDb4M74ea69QwKAECVceYpaBu0GfdnASMg9Qfl67ttB23nePHg==
+
uc.micro@^1.0.1, uc.micro@^1.0.5:
version "1.0.6"
resolved "https://registry.yarnpkg.com/uc.micro/-/uc.micro-1.0.6.tgz#9c411a802a409a91fc6cf74081baba34b24499ac"
diff --git a/packages/string-templates/package.json b/packages/string-templates/package.json
index c30b6d7c4d..a5714c0f89 100644
--- a/packages/string-templates/package.json
+++ b/packages/string-templates/package.json
@@ -1,6 +1,6 @@
{
"name": "@budibase/string-templates",
- "version": "1.0.49-alpha.5",
+ "version": "1.0.50-alpha.6",
"description": "Handlebars wrapper for Budibase templating.",
"main": "src/index.cjs",
"module": "dist/bundle.mjs",
@@ -13,6 +13,11 @@
},
"./package.json": "./package.json"
},
+ "files": [
+ "dist",
+ "src",
+ "manifest.json"
+ ],
"scripts": {
"build": "tsc && rollup -c",
"dev:builder": "tsc && rollup -cw",
diff --git a/packages/string-templates/src/helpers/index.js b/packages/string-templates/src/helpers/index.js
index 6b9195047e..ad4082e3a4 100644
--- a/packages/string-templates/src/helpers/index.js
+++ b/packages/string-templates/src/helpers/index.js
@@ -21,7 +21,7 @@ const HELPERS = [
// javascript helper
new Helper(HelperFunctionNames.JS, processJS, false),
// this help is applied to all statements
- new Helper(HelperFunctionNames.ALL, value => {
+ new Helper(HelperFunctionNames.ALL, (value, { __opts }) => {
if (
value != null &&
typeof value === "object" &&
@@ -36,7 +36,11 @@ const HELPERS = [
if (value && value.string) {
value = value.string
}
- let text = new SafeString(value.replace(/&/g, "&"))
+ let text = value
+ if (__opts && __opts.escapeNewlines) {
+ text = value.replace(/\n/g, "\\n")
+ }
+ text = new SafeString(text.replace(/&/g, "&"))
if (text == null || typeof text !== "string") {
return text
}
@@ -62,10 +66,14 @@ module.exports.HelperNames = () => {
)
}
-module.exports.registerAll = handlebars => {
+module.exports.registerMinimum = handlebars => {
for (let helper of HELPERS) {
helper.register(handlebars)
}
+}
+
+module.exports.registerAll = handlebars => {
+ module.exports.registerMinimum(handlebars)
// register imported helpers
externalHandlebars.registerAll(handlebars)
}
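The `__opts.escapeNewlines` branch added to the `all` helper above can be sketched in isolation. This is an illustrative standalone function, not the package code: when the option is set, literal newlines in a resolved value are replaced with the two-character sequence `\n`, keeping the output on a single line.

```javascript
// Minimal sketch (assumption: mirrors the new escapeNewlines branch in the
// "all" helper): replace literal newlines with the escaped sequence "\n".
function applyEscapeNewlines(value, opts) {
  if (opts && opts.escapeNewlines) {
    return value.replace(/\n/g, "\\n")
  }
  return value
}

console.log(applyEscapeNewlines("line one\nline two", { escapeNewlines: true }))
// prints: line one\nline two
console.log(applyEscapeNewlines("line one\nline two", {}))
// prints the two lines unchanged
```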
diff --git a/packages/string-templates/src/index.cjs b/packages/string-templates/src/index.cjs
index bc9a410813..5a84c45d78 100644
--- a/packages/string-templates/src/index.cjs
+++ b/packages/string-templates/src/index.cjs
@@ -17,6 +17,7 @@ module.exports.processString = templates.processString
module.exports.processObject = templates.processObject
module.exports.doesContainStrings = templates.doesContainStrings
module.exports.doesContainString = templates.doesContainString
+module.exports.disableEscaping = templates.disableEscaping
/**
* Use vm2 to run JS scripts in a node env
@@ -27,4 +28,4 @@ setJSRunner((js, context) => {
timeout: 1000
})
return vm.run(js)
-})
\ No newline at end of file
+})
diff --git a/packages/string-templates/src/index.js b/packages/string-templates/src/index.js
index 616981995d..7996bb9f1f 100644
--- a/packages/string-templates/src/index.js
+++ b/packages/string-templates/src/index.js
@@ -1,14 +1,15 @@
const handlebars = require("handlebars")
-const { registerAll } = require("./helpers/index")
+const { registerAll, registerMinimum } = require("./helpers/index")
const processors = require("./processors")
const { atob, btoa } = require("./utilities")
const manifest = require("../manifest.json")
-const { FIND_HBS_REGEX } = require("./utilities")
+const { FIND_HBS_REGEX, FIND_DOUBLE_HBS_REGEX } = require("./utilities")
const hbsInstance = handlebars.create()
registerAll(hbsInstance)
const hbsInstanceNoHelpers = handlebars.create()
-const defaultOpts = { noHelpers: false }
+registerMinimum(hbsInstanceNoHelpers)
+const defaultOpts = { noHelpers: false, noEscaping: false }
/**
* utility function to check if the object is valid
@@ -27,7 +28,7 @@ function testObject(object) {
* @param {object|array} object The input structure which is to be recursed, it is important to note that
* if the structure contains any cycles then this will fail.
* @param {object} context The context that handlebars should fill data from.
- * @param {object|null} opts optional - specify some options for processing.
+ * @param {object|undefined} opts optional - specify some options for processing.
* @returns {Promise} The structure input, as fully updated as possible.
*/
module.exports.processObject = async (object, context, opts) => {
@@ -58,7 +59,7 @@ module.exports.processObject = async (object, context, opts) => {
* then nothing will occur.
* @param {string} string The template string which is the filled from the context object.
* @param {object} context An object of information which will be used to enrich the string.
- * @param {object|null} opts optional - specify some options for processing.
+ * @param {object|undefined} opts optional - specify some options for processing.
* @returns {Promise} The enriched string, all templates should have been replaced if they can be.
*/
module.exports.processString = async (string, context, opts) => {
@@ -72,7 +73,7 @@ module.exports.processString = async (string, context, opts) => {
* @param {object|array} object The input structure which is to be recursed, it is important to note that
* if the structure contains any cycles then this will fail.
* @param {object} context The context that handlebars should fill data from.
- * @param {object|null} opts optional - specify some options for processing.
+ * @param {object|undefined} opts optional - specify some options for processing.
* @returns {object|array} The structure input, as fully updated as possible.
*/
module.exports.processObjectSync = (object, context, opts) => {
@@ -93,7 +94,7 @@ module.exports.processObjectSync = (object, context, opts) => {
* then nothing will occur. This is a pure sync call and therefore does not have the full functionality of the async call.
* @param {string} string The template string which is the filled from the context object.
* @param {object} context An object of information which will be used to enrich the string.
- * @param {object|null} opts optional - specify some options for processing.
+ * @param {object|undefined} opts optional - specify some options for processing.
* @returns {string} The enriched string, all templates should have been replaced if they can be.
*/
module.exports.processStringSync = (string, context, opts) => {
@@ -105,26 +106,46 @@ module.exports.processStringSync = (string, context, opts) => {
throw "Cannot process non-string types."
}
try {
- // finalising adds a helper, can't do this with no helpers
- const shouldFinalise = !opts.noHelpers
- string = processors.preprocess(string, shouldFinalise)
+ string = processors.preprocess(string, opts)
// this does not throw an error when template can't be fulfilled, have to try correct beforehand
const instance = opts.noHelpers ? hbsInstanceNoHelpers : hbsInstance
- const template = instance.compile(string, {
+ const templateString =
+ opts && opts.noEscaping ? exports.disableEscaping(string) : string
+ const template = instance.compile(templateString, {
strict: false,
})
const now = Math.floor(Date.now() / 1000) * 1000
return processors.postprocess(
template({
now: new Date(now).toISOString(),
+ __opts: opts,
...context,
- })
+ }),
+ { escapeNewlines: opts ? opts.escapeNewlines : false }
)
} catch (err) {
return input
}
}
+/**
+ * By default with expressions like {{ name }} handlebars will escape various
+ * characters, which can be problematic. To fix this we use the syntax {{{ name }}},
+ * this function will find any double braces and switch to triple.
+ * @param string the string to have double HBS statements converted to triple.
+ */
+module.exports.disableEscaping = string => {
+ let regexp = new RegExp(FIND_DOUBLE_HBS_REGEX)
+ const matches = string.match(regexp)
+ if (matches == null) {
+ return string
+ }
+ for (let match of matches) {
+ string = string.replace(match, `{${match}}`)
+ }
+ return string
+}
+
/**
* Simple utility function which makes sure that a templating property has been wrapped in literal specifiers correctly.
* @param {string} property The property which is to be wrapped.
@@ -156,7 +177,9 @@ module.exports.isValid = (string, opts) => {
const context = {}
try {
const instance = opts.noHelpers ? hbsInstanceNoHelpers : hbsInstance
- instance.compile(processors.preprocess(string, false))(context)
+ instance.compile(processors.preprocess(string, { noFinalise: true }))(
+ context
+ )
return true
} catch (err) {
const msg = err && err.message ? err.message : err
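The new `disableEscaping` export can be sketched standalone. The pattern used here for `FIND_DOUBLE_HBS_REGEX` is an assumption (the real constant lives in `utilities.js`), chosen to match the behaviour the tests describe: double-brace statements are promoted to triple braces, while existing triple-brace statements are left alone.

```javascript
// Assumed pattern: match "{{ ... }}" only when it is not already part of a
// triple-brace "{{{ ... }}}" statement (lookbehind/lookahead guard the
// surrounding braces).
const FIND_DOUBLE_HBS_REGEX = /(?<!{){{[^{}]+}}(?!})/g

// Sketch of disableEscaping: swap each "{{ ... }}" for "{{{ ... }}}" so that
// handlebars skips HTML-escaping the resolved value.
function disableEscaping(string) {
  const matches = string.match(FIND_DOUBLE_HBS_REGEX)
  if (matches == null) {
    return string
  }
  for (const match of matches) {
    string = string.replace(match, `{${match}}`)
  }
  return string
}

console.log(disableEscaping("{{ name }} welcome to {{{ platform }}}"))
// prints: {{{ name }}} welcome to {{{ platform }}}
```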
diff --git a/packages/string-templates/src/index.mjs b/packages/string-templates/src/index.mjs
index a592ae26d5..f2cffdf378 100644
--- a/packages/string-templates/src/index.mjs
+++ b/packages/string-templates/src/index.mjs
@@ -17,6 +17,7 @@ export const processString = templates.processString
export const processObject = templates.processObject
export const doesContainStrings = templates.doesContainStrings
export const doesContainString = templates.doesContainString
+export const disableEscaping = templates.disableEscaping
/**
* Use polyfilled vm to run JS scripts in a browser Env
@@ -30,4 +31,4 @@ setJSRunner((js, context) => {
}
vm.createContext(context)
return vm.runInNewContext(js, context, { timeout: 1000 })
-})
\ No newline at end of file
+})
diff --git a/packages/string-templates/src/processors/index.js b/packages/string-templates/src/processors/index.js
index 174041133a..aae18aed8b 100644
--- a/packages/string-templates/src/processors/index.js
+++ b/packages/string-templates/src/processors/index.js
@@ -2,7 +2,7 @@ const { FIND_HBS_REGEX } = require("../utilities")
const preprocessor = require("./preprocessor")
const postprocessor = require("./postprocessor")
-function process(output, processors) {
+function process(output, processors, opts) {
for (let processor of processors) {
// if a literal statement has occurred stop
if (typeof output !== "string") {
@@ -15,24 +15,22 @@ function process(output, processors) {
continue
}
for (let match of matches) {
- output = processor.process(output, match)
+ output = processor.process(output, match, opts)
}
}
return output
}
-module.exports.preprocess = (string, finalise = true) => {
+module.exports.preprocess = (string, opts) => {
let processors = preprocessor.processors
- // the pre-processor finalisation stops handlebars from ever throwing an error
- // might want to pre-process for other benefits but still want to see errors
- if (!finalise) {
+ if (opts.noFinalise) {
processors = processors.filter(
processor => processor.name !== preprocessor.PreprocessorNames.FINALISE
)
}
+ return process(string, processors, opts)
+}
+module.exports.postprocess = string => {
+ let processors = postprocessor.processors
return process(string, processors)
}
-
-module.exports.postprocess = string => {
- return process(string, postprocessor.processors)
-}
diff --git a/packages/string-templates/src/processors/postprocessor.js b/packages/string-templates/src/processors/postprocessor.js
index 7fc3f663fe..f78a572d07 100644
--- a/packages/string-templates/src/processors/postprocessor.js
+++ b/packages/string-templates/src/processors/postprocessor.js
@@ -16,6 +16,8 @@ class Postprocessor {
}
}
+module.exports.PostProcessorNames = PostProcessorNames
+
module.exports.processors = [
new Postprocessor(PostProcessorNames.CONVERT_LITERALS, statement => {
if (typeof statement !== "string" || !statement.includes(LITERAL_MARKER)) {
diff --git a/packages/string-templates/src/processors/preprocessor.js b/packages/string-templates/src/processors/preprocessor.js
index 6f6537674a..4b296d0fc7 100644
--- a/packages/string-templates/src/processors/preprocessor.js
+++ b/packages/string-templates/src/processors/preprocessor.js
@@ -16,8 +16,8 @@ class Preprocessor {
this.fn = fn
}
- process(fullString, statement) {
- const output = this.fn(statement)
+ process(fullString, statement, opts) {
+ const output = this.fn(statement, opts)
const idx = fullString.indexOf(statement)
return swapStrings(fullString, idx, statement.length, output)
}
@@ -48,7 +48,8 @@ module.exports.processors = [
return statement
}),
- new Preprocessor(PreprocessorNames.FINALISE, statement => {
+ new Preprocessor(PreprocessorNames.FINALISE, (statement, opts) => {
+ const noHelpers = opts && opts.noHelpers
let insideStatement = statement.slice(2, statement.length - 2)
if (insideStatement.charAt(0) === " ") {
insideStatement = insideStatement.slice(1)
@@ -63,7 +64,10 @@ module.exports.processors = [
return statement
}
}
- if (HelperNames().some(option => option.includes(possibleHelper))) {
+ if (
+ !noHelpers &&
+ HelperNames().some(option => option.includes(possibleHelper))
+ ) {
insideStatement = `(${insideStatement})`
}
return `{{ all ${insideStatement} }}`
diff --git a/packages/string-templates/src/utilities.js b/packages/string-templates/src/utilities.js
index 645aca78ba..9704f84ecc 100644
--- a/packages/string-templates/src/utilities.js
+++ b/packages/string-templates/src/utilities.js
@@ -1,6 +1,7 @@
const ALPHA_NUMERIC_REGEX = /^[A-Za-z0-9]+$/g
module.exports.FIND_HBS_REGEX = /{{([^{].*?)}}/g
+module.exports.FIND_DOUBLE_HBS_REGEX = /(?<!{)({{[^{}]+}})(?!})/g

module.exports.isAlphaNumeric = char => {
return char.match(ALPHA_NUMERIC_REGEX)
diff --git a/packages/string-templates/test/basic.spec.js b/packages/string-templates/test/basic.spec.js
index 490c0aa514..c5aac2a628 100644
--- a/packages/string-templates/test/basic.spec.js
+++ b/packages/string-templates/test/basic.spec.js
@@ -6,6 +6,7 @@ const {
getManifest,
encodeJSBinding,
doesContainString,
+ disableEscaping,
} = require("../src/index.cjs")
describe("Test that the string processing works correctly", () => {
@@ -176,3 +177,22 @@ describe("check does contain string function", () => {
expect(doesContainString(js, "foo")).toEqual(true)
})
})
+
+describe("check that disabling escaping function works", () => {
+ it("should work for a single statement", () => {
+ expect(disableEscaping("{{ name }}")).toEqual("{{{ name }}}")
+ })
+
+ it("should work for two statements", () => {
+ expect(disableEscaping("{{ name }} welcome to {{ platform }}")).toEqual("{{{ name }}} welcome to {{{ platform }}}")
+ })
+
+ it("shouldn't convert triple braces", () => {
+ expect(disableEscaping("{{{ name }}}")).toEqual("{{{ name }}}")
+ })
+
+ it("should work with a combination", () => {
+ expect(disableEscaping("{{ name }} welcome to {{{ platform }}}")).toEqual("{{{ name }}} welcome to {{{ platform }}}")
+ })
+})
+
diff --git a/packages/string-templates/test/escapes.spec.js b/packages/string-templates/test/escapes.spec.js
index 7e55b66b88..b845fddec9 100644
--- a/packages/string-templates/test/escapes.spec.js
+++ b/packages/string-templates/test/escapes.spec.js
@@ -59,3 +59,33 @@ describe("attempt some complex problems", () => {
expect(output).toBe("nulltest")
})
})
+
+describe("check behaviour with newlines", () => {
+ const context = {
+ binding: `Hello
+ there`
+ }
+ it("should escape new line to \\n with double brace", async () => {
+ const hbs = JSON.stringify({
+ body: "{{ binding }}"
+ })
+ const output = await processString(hbs, context, { escapeNewlines: true })
+ expect(JSON.parse(output).body).toBe(context.binding)
+ })
+
+ it("should work the same with triple brace", async () => {
+ const hbs = JSON.stringify({
+ body: "{{{ binding }}}"
+ })
+ const output = await processString(hbs, context, { escapeNewlines: true })
+ expect(JSON.parse(output).body).toBe(context.binding)
+ })
+
+ it("should still work with helpers disabled", async () => {
+ const hbs = JSON.stringify({
+ body: "{{ binding }}"
+ })
+ const output = await processString(hbs, context, { escapeNewlines: true, noHelpers: true })
+ expect(JSON.parse(output).body).toBe(context.binding)
+ })
+})
diff --git a/packages/string-templates/test/helpers.spec.js b/packages/string-templates/test/helpers.spec.js
index b4179475fb..0d39660d59 100644
--- a/packages/string-templates/test/helpers.spec.js
+++ b/packages/string-templates/test/helpers.spec.js
@@ -20,7 +20,7 @@ describe("test that it can run without helpers", () => {
)
const valid = await processString("{{ avg 1 1 1 }}", {})
expect(valid).toBe("1")
- expect(output).toBe("{{ avg 1 1 1 }}")
+ expect(output).toBe("")
})
})
diff --git a/packages/worker/package.json b/packages/worker/package.json
index c85d3087b1..699b4abdf6 100644
--- a/packages/worker/package.json
+++ b/packages/worker/package.json
@@ -1,7 +1,7 @@
{
"name": "@budibase/worker",
"email": "hi@budibase.com",
- "version": "1.0.49-alpha.5",
+ "version": "1.0.50-alpha.6",
"description": "Budibase background service",
"main": "src/index.ts",
"repository": {
@@ -34,8 +34,8 @@
"author": "Budibase",
"license": "GPL-3.0",
"dependencies": {
- "@budibase/backend-core": "^1.0.49-alpha.5",
- "@budibase/string-templates": "^1.0.49-alpha.5",
+ "@budibase/backend-core": "^1.0.50-alpha.6",
+ "@budibase/string-templates": "^1.0.50-alpha.6",
"@koa/router": "^8.0.0",
"@sentry/node": "^6.0.0",
"@techpass/passport-openidconnect": "^0.3.0",
diff --git a/packages/worker/src/api/controllers/global/configs.js b/packages/worker/src/api/controllers/global/configs.js
index fc0aa868a3..604e7d0e93 100644
--- a/packages/worker/src/api/controllers/global/configs.js
+++ b/packages/worker/src/api/controllers/global/configs.js
@@ -11,7 +11,6 @@ const {
upload,
ObjectStoreBuckets,
} = require("@budibase/backend-core/objectStore")
-const CouchDB = require("../../../db")
const { getGlobalDB, getTenantId } = require("@budibase/backend-core/tenancy")
const env = require("../../../environment")
const { googleCallbackUrl, oidcCallbackUrl } = require("./auth")
@@ -252,7 +251,7 @@ exports.configChecklist = async function (ctx) {
// TODO: Watch get started video
// Apps exist
- const apps = await getAllApps(CouchDB, { idsOnly: true })
+ const apps = await getAllApps({ idsOnly: true })
// They have set up SMTP
const smtpConfig = await getScopedFullConfig(db, {
diff --git a/packages/worker/src/api/controllers/global/roles.js b/packages/worker/src/api/controllers/global/roles.js
index 3c977a6290..96de0e4753 100644
--- a/packages/worker/src/api/controllers/global/roles.js
+++ b/packages/worker/src/api/controllers/global/roles.js
@@ -1,15 +1,15 @@
const { getAllRoles } = require("@budibase/backend-core/roles")
const {
getAllApps,
- getDeployedAppID,
+ getProdAppID,
DocumentTypes,
} = require("@budibase/backend-core/db")
-const CouchDB = require("../../../db")
+const { doInAppContext, getAppDB } = require("@budibase/backend-core/context")
exports.fetch = async ctx => {
const tenantId = ctx.user.tenantId
// always use the dev apps as they'll be most up to date (true)
- const apps = await getAllApps(CouchDB, { tenantId, all: true })
+ const apps = await getAllApps({ tenantId, all: true })
const promises = []
for (let app of apps) {
// use dev app IDs
@@ -18,7 +18,7 @@ exports.fetch = async ctx => {
const roles = await Promise.all(promises)
const response = {}
for (let app of apps) {
- const deployedAppId = getDeployedAppID(app.appId)
+ const deployedAppId = getProdAppID(app.appId)
response[deployedAppId] = {
roles: roles.shift(),
name: app.name,
@@ -31,12 +31,14 @@ exports.fetch = async ctx => {
exports.find = async ctx => {
const appId = ctx.params.appId
- const db = new CouchDB(appId)
- const app = await db.get(DocumentTypes.APP_METADATA)
- ctx.body = {
- roles: await getAllRoles(appId),
- name: app.name,
- version: app.version,
- url: app.url,
- }
+ await doInAppContext(appId, async () => {
+ const db = getAppDB()
+ const app = await db.get(DocumentTypes.APP_METADATA)
+ ctx.body = {
+ roles: await getAllRoles(),
+ name: app.name,
+ version: app.version,
+ url: app.url,
+ }
+ })
}