
Commit 687a9bb

Allow plain objects for TokenData (#391)
1 parent a4a8552 commit 687a9bb

3 files changed: +55 −15 lines

Readme.md

Lines changed: 19 additions & 13 deletions
@@ -66,7 +66,7 @@ fn("/users/123/delete");
 
 The `match` function returns a function for matching strings against a path:
 
-- **path** String or array of strings.
+- **path** String, `TokenData` object, or array of strings and `TokenData` objects.
 - **options** _(optional)_ (Extends [pathToRegexp](#pathToRegexp) options)
   - **decode** Function for decoding strings to params, or `false` to disable all processing. (default: `decodeURIComponent`)
 
@@ -80,7 +80,7 @@ const fn = match("/foo/:bar");
 
 The `pathToRegexp` function returns the `regexp` for matching strings against paths, and an array of `keys` for understanding the `RegExp#exec` matches.
 
-- **path** String or array of strings.
+- **path** String, `TokenData` object, or array of strings and `TokenData` objects.
 - **options** _(optional)_ (See [parse](#parse) for more options)
   - **sensitive** Regexp will be case sensitive. (default: `false`)
   - **end** Validate the match reaches the end of the string. (default: `true`)
@@ -97,7 +97,7 @@ regexp.exec("/foo/123"); //=> ["/foo/123", "123"]
 
 The `compile` function will return a function for transforming parameters into a valid path:
 
-- **path** A string.
+- **path** A string or `TokenData` object.
 - **options** (See [parse](#parse) for more options)
   - **delimiter** The default delimiter for segments, e.g. `[^/]` for `:named` parameters. (default: `'/'`)
   - **encode** Function for encoding input strings for output into the path, or `false` to disable entirely. (default: `encodeURIComponent`)
@@ -121,15 +121,17 @@ toPathRaw({ id: "%3A%2F" }); //=> "/user/%3A%2F"
 
 ## Stringify
 
-Transform `TokenData` (a sequence of tokens) back into a Path-to-RegExp string.
+Transform a `TokenData` object to a Path-to-RegExp string.
 
-- **data** A `TokenData` instance
+- **data** A `TokenData` object.
 
 ```js
-const data = new TokenData([
-  { type: "text", value: "/" },
-  { type: "param", name: "foo" },
-]);
+const data = {
+  tokens: [
+    { type: "text", value: "/" },
+    { type: "param", name: "foo" },
+  ],
+};
 
 const path = stringify(data); //=> "/:foo"
 ```
@@ -149,20 +151,24 @@ The `parse` function accepts a string and returns `TokenData`, which can be used
 
 ### Tokens
 
-`TokenData` is a sequence of tokens, currently of types `text`, `parameter`, `wildcard`, or `group`.
+`TokenData` has two properties:
+
+- **tokens** A sequence of tokens, currently of types `text`, `parameter`, `wildcard`, or `group`.
+- **originalPath** The original path used with `parse`, shown in error messages to assist debugging.
 
 ### Custom path
 
-In some applications, you may not be able to use the `path-to-regexp` syntax, but still want to use this library for `match` and `compile`. For example:
+In some applications you may not be able to use the `path-to-regexp` syntax, but you still want to use this library for `match` and `compile`. For example:
 
 ```js
-import { TokenData, match } from "path-to-regexp";
+import { match } from "path-to-regexp";
 
 const tokens = [
   { type: "text", value: "/" },
   { type: "parameter", name: "foo" },
 ];
-const path = new TokenData(tokens);
+const originalPath = "/[foo]"; // To help debug error messages.
+const path = { tokens, originalPath };
 const fn = match(path);
 
 fn("/test"); //=> { path: '/test', index: 0, params: { foo: 'test' } }

src/cases.spec.ts

Lines changed: 34 additions & 0 deletions
@@ -221,6 +221,16 @@ export const STRINGIFY_TESTS: StringifyTestSet[] = [
     ]),
     expected: "\\\\:test",
   },
+  {
+    data: {
+      tokens: [
+        { type: "text", value: "/" },
+        { type: "param", name: "test" },
+      ],
+      originalPath: "/:test",
+    },
+    expected: "/:test",
+  },
 ];
 
 export const COMPILE_TESTS: CompileTestSet[] = [
@@ -323,6 +333,15 @@ export const COMPILE_TESTS: CompileTestSet[] = [
       { input: { test: "123/xyz" }, expected: "/123/xyz" },
     ],
   },
+  {
+    path: {
+      tokens: [
+        { type: "text", value: "/" },
+        { type: "param", name: "test" },
+      ],
+    },
+    tests: [{ input: { test: "123" }, expected: "/123" }],
+  },
 ];
 
 /**
@@ -1678,4 +1697,19 @@ export const MATCH_TESTS: MatchTestSet[] = [
       },
     ],
   },
+
+  /**
+   * Token data.
+   */
+  {
+    path: {
+      tokens: [
+        { type: "text", value: "/" },
+        { type: "param", name: "test" },
+      ],
+    },
+    tests: [
+      { input: "/123", expected: { path: "/123", params: { test: "123" } } },
+    ],
+  },
 ];
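
For reference, the new `COMPILE_TESTS` entry above corresponds to this direct call; a minimal sketch assuming the test-set fields map one-to-one onto `compile`'s arguments:

```ts
import { compile } from "path-to-regexp";

// Compile a plain TokenData object directly, without going through parse().
const toPath = compile({
  tokens: [
    { type: "text", value: "/" },
    { type: "param", name: "test" },
  ],
});

toPath({ test: "123" }); //=> "/123"
```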

src/index.ts

Lines changed: 2 additions & 2 deletions
@@ -334,7 +334,7 @@ export function compile<P extends ParamData = ParamData>(
 ) {
   const { encode = encodeURIComponent, delimiter = DEFAULT_DELIMITER } =
     options;
-  const data = path instanceof TokenData ? path : parse(path, options);
+  const data = typeof path === "object" ? path : parse(path, options);
   const fn = tokensToFunction(data.tokens, delimiter, encode);
 
   return function path(params: P = {} as P) {
@@ -504,7 +504,7 @@ export function pathToRegexp(
   const sources: string[] = [];
 
   for (const input of pathsToArray(path, [])) {
-    const data = input instanceof TokenData ? input : parse(input, options);
+    const data = typeof input === "object" ? input : parse(input, options);
     for (const tokens of flatten(data.tokens, 0, [])) {
       sources.push(toRegExp(tokens, delimiter, keys, data.originalPath));
     }
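
The runtime change is only the guard: the `instanceof TokenData` check becomes a structural `typeof ... === "object"` check, so any object carrying a `tokens` array (and optionally `originalPath`) is treated as already-parsed token data, while strings still go through `parse`. A standalone sketch of that duck-typing idea, with an illustrative `TokenDataLike` type that is not the library's actual export:

```ts
// Illustrative structural type; the real library exports richer token types.
interface TokenDataLike {
  tokens: Array<
    { type: "text"; value: string } | { type: "param"; name: string }
  >;
  originalPath?: string;
}

// Mirrors `typeof path === "object" ? path : parse(path, options)`:
// objects (plain literals or TokenData class instances alike) are used as-is,
// strings are parsed first.
function toTokenData(
  path: string | TokenDataLike,
  parse: (path: string) => TokenDataLike,
): TokenDataLike {
  return typeof path === "object" ? path : parse(path);
}
```

Because class instances are also objects, existing `new TokenData(...)` callers still pass the new check; the guard simply no longer requires that particular prototype.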
