Compare commits
No commits in common. "ab799fef5b52ad55539b0d60db96a3b67e7a7a59" and "83cd4d71190b5f1669cee6fafbb71ee2360e6d0d" have entirely different histories.
ab799fef5b
...
83cd4d7119
24 changed files with 206 additions and 685 deletions
README.md (95 changes)
@@ -1,6 +1,4 @@
# upload-generator (ARCHIVE)

> This project has been superseded by my current web gallery build system. It won't receive any more updates.
# upload-generator

Script to generate multi-gallery upload-ready files.
@@ -9,32 +7,9 @@ Script to generate multi-gallery upload-ready files.
- A Python environment to install dependencies (`pip install -r requirements.txt`); if unsure, create a fresh one with `virtualenv venv`.
- LibreOffice 6.0+, making sure that `libreoffice` is in your PATH.

## Installation

I recommend creating a virtualenv first. Linux/macOS/Unix example:

```sh
virtualenv venv
source venv/bin/activate # Also run every time you use this tool
pip install -r requirements.txt
activate-global-python-argcomplete
```

Windows example (autocompletion is not available):

```powershell
virtualenv venv
.\venv\Scripts\activate # Also run every time you use this tool
pip install -r requirements.txt
```

## Testing

Run `python test.py`.

## Usage

Run with `python main.py -h` (or simply `./main.py -h`) for options. Generated files are output to `./out` by default.
Run with `python main.py -h` for options. Generated files are output to `./out` by default.

### Story files
@@ -54,69 +29,43 @@ In order to parse descriptions, you need a configuration file (default path is `
}
```

Uppercase letters for usernames are optional. Only include your username for websites that you wish to generate descriptions/stories for.

#### Basic formatting
Uppercase letters are optional. Only include your username for websites that you wish to generate descriptions for.

Input descriptions should be formatted as BBCode. The following tags are accepted:

```bbcode
[b]Bold text[/b]
[i]Italic text[/i]
[u]Underline text[/u]
[center]Center-aligned text[/center]
[url=https://github.com/BadMannersXYZ]URL link[/url]
[url=https://github.com]URL link[/url]
```

#### Self-link formatting

`[self][/self]` will create a link to yourself for each website, with the same formatting as the `[user]...[/user]` switch. The inside of this tag must always be empty.

#### Conditional formatting

Another special set of tags is `[if=...][/if]` or `[if=...][/if][else][/else]`. The `if` tag lets you conditionally show content. The `else` tag is optional but must appear immediately after an `if` tag (no whitespace in between), and displays whenever the condition is false instead.

The following parameters are available:

- `site`: generated according to the target website, e.g. `[if=site==fa]...[/if]` or `[if=site!=furaffinity]...[/if][else]...[/else]`
- `define`: generated according to argument(s) passed to the script on the command line (i.e. with the `-D / --define-option` flag), e.g. `[if=define==prod]...[/if][else]...[/else]` or `[if=define in possible_flag_1,possible_flag_2]...[/if][else]...[/else]`

The following conditions are available:

- `==`: e.g. `[if=site==eka]Only show this on Eka's Portal.[/if][else]Show this everywhere except Eka's Portal![/else]`
- `!=`: e.g. `[if=site!=eka]Show this everywhere except Eka's Portal![/if]`
- ` in `: e.g. `[if=site in eka,fa]Only show this on Eka's Portal or Fur Affinity...[/if]`

#### Switch formatting

You can use special switch tags, which will generate different information per website automatically. There are two options available: creating different URLs per website, or linking to different users.
There are also special tags to link to yourself or other users automatically. This may include websites not available in the configuration:

```bbcode
Available for both [user]...[/user] and [siteurl]...[/siteurl] tags
- [generic=https://example.com/GenericUser]Generic text to display[/generic]
- [eka=EkasPortalUser][/eka] [aryion=EkasPortalUser][/aryion]
- [fa=FurAffinityUser][/fa] [furaffinity=FurAffinityUser][/furaffinity]
- [weasyl=WeasylUser][/weasyl]
- [ib=InkbunnyUser][/ib] [inkbunny=InkbunnyUser][/inkbunny]
- [sf=SoFurryUser][/sf] [sofurry=SoFurryUser][/sofurry]
[self][/self]

Available only for [user]...[/user]
- [twitter=@TwitterUser][/twitter] - Leading '@' is optional
- [mastodon=@MastodonUser@mastodoninstance.com][/mastodon] - Leading '@' is optional
[eka]EkasPortalUser[/eka]
[fa]FurAffinityUser[/fa]
[weasyl]WeasylUser[/weasyl]
[ib]InkbunnyUser[/ib]
[sf]SoFurryUser[/sf]
[twitter]@TwitterUser[/twitter] - Leading '@' is optional
[mastodon]@MastodonUser@mastodoninstance.com[/mastodon] - Leading '@' is optional
```

These tags are nestable and flexible, requiring attributes to display information differently on each supported website. Some examples:
`[self][/self]` tags must always be empty. The other tags are nestable and flexible, allowing attributes to display information differently on each supported website. Some examples:

```bbcode
[user][eka]Lorem[/eka][/user] is equivalent to [user][eka=Lorem][/eka][/user].
[eka=Lorem][/eka] is equivalent to [eka]Lorem[/eka].

[user][fa=Ipsum]Dolor[/fa][/user] shows Ipsum's username on Fur Affinity, and "Dolor" everywhere else with a link to Ipsum's userpage on FA.
[fa=Ipsum]Dolor[/fa] shows Ipsum's username on FurAffinity, and Dolor everywhere else as a link to Ipsum's FA userpage.

[user][ib=Sit][weasyl=Amet][twitter=Consectetur][/twitter][/weasyl][/ib][/user] will show different usernames on Inkbunny and Weasyl. For other websites, the innermost username and link are prioritized - Twitter, in this case.
[user][ib=Sit][twitter=Consectetur][weasyl=Amet][/weasyl][/twitter][/ib][/user] is similar, but the Weasyl user data is prioritized for websites other than Inkbunny. In this case, the Twitter tag is rendered useless, since descriptions can't be generated for the website.
[weasyl=Sit][ib=Amet][/ib][/weasyl] will show the two user links on Weasyl and Inkbunny as expected. For other websites, the innermost tag is prioritized - Inkbunny, in this case.
[ib=Amet][weasyl=Sit][/weasyl][/ib] is the same as above, but the Weasyl link is prioritized instead.

[siteurl][sf=https://a.com][eka=https://b.com]Adipiscing[/eka][/sf][/siteurl] displays links on SoFurry and Eka's Portal, with "Adipiscing" as the link's text. Other websites won't display any link.
[siteurl][sf=https://a.com][eka=https://b.com][generic=https://c.com]Adipiscing[/generic][/eka][/sf][/siteurl] is the same as above, but with the innermost generic tag serving as a fallback, guaranteeing that a link will be generated for all websites.
[ib=Amet][weasyl=Sit]Consectetur[/weasyl][/ib] is the same as above, but Consectetur is displayed as the username for websites other than Inkbunny and Weasyl, with a link to the Weasyl gallery.

[user][fa=Elit][generic=https://github.com/BadMannersXYZ]Bad Manners[/generic][/fa][/user] shows how a generic tag can be used for user links as well, displayed everywhere aside from Fur Affinity in this example. User tags don't need an explicit fallback - the innermost tag is always used as a fallback for user links.
[generic=https://github.com/BadMannersXYZ]Bad Manners[/generic] can be used as the innermost tag with a mandatory URL attribute and default username, and is similar to the URL tag, but it can be nested within other profile links. Those other profile links get used only at their respective websites.
```

Another special set of tags is `[if][/if]` and `[else][/else]`. The `if` tag receives a parameter for the condition (i.e. `[if=parameter==value]...[/if]` or `[if=parameter!=value]...[/if]`) to check on the current transformer, and lets you show or omit generated content respectively. The `else` tag is optional but must appear immediately after an `if` tag (no whitespace in between), and displays whenever the condition is false instead. For now, the `if` tag only accepts the `site` parameter (e.g. `[if=site==fa]...[/if][else]...[/else]` or `[if=site!=furaffinity]...[/if]`).
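The three condition forms above can be illustrated with a small stdlib-only sketch. The function name and the `matches` predicate are illustrative, not part of the tool's API; the real evaluation happens inside the transformer classes:

```python
def evaluate_condition(condition: str, matches) -> bool:
    """Evaluate an `[if=...]` style condition string.

    `matches` is a predicate standing in for a check like
    "does this transformer target that site?".
    """
    # Equality, e.g. `site==fa`
    eq = condition.split('==', 1)
    if len(eq) == 2 and eq[1].strip():
        return matches(eq[1].strip())
    # Inequality, e.g. `site!=eka`
    ne = condition.split('!=', 1)
    if len(ne) == 2 and ne[1].strip():
        return not matches(ne[1].strip())
    # Inclusion, e.g. `site in eka,fa`
    inc = condition.split(' in ', 1)
    if len(inc) == 2 and inc[1].strip():
        return any(matches(p.strip()) for p in inc[1].split(','))
    raise ValueError(f'Invalid condition: {condition}')
```

Note that this sketch ignores the parameter name on the left-hand side; the real code dispatches on it to pick the matching predicate.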
description.py (410 changes)
@@ -3,18 +3,12 @@ import io
import json
import lark
import os
import psutil
import re
import subprocess
import typing

from sites import SUPPORTED_SITE_TAGS

SUPPORTED_USER_TAGS: typing.Mapping[str, typing.Set[str]] = {
    **SUPPORTED_SITE_TAGS,
    'twitter': {'twitter'},
    'mastodon': {'mastodon'},
}
SUPPORTED_USER_TAGS = ['eka', 'fa', 'weasyl', 'ib', 'sf', 'twitter', 'mastodon']

DESCRIPTION_GRAMMAR = r"""
?start: document_list
@@ -24,58 +18,39 @@ DESCRIPTION_GRAMMAR = r"""
document: b_tag
    | i_tag
    | u_tag
    | center_tag
    | url_tag
    | self_tag
    | if_tag
    | user_tag_root
    | siteurl_tag_root
    | TEXT

b_tag: "[b]" [document_list] "[/b]"
i_tag: "[i]" [document_list] "[/i]"
u_tag: "[u]" [document_list] "[/u]"
center_tag: "[center]" [document_list] "[/center]"
url_tag: "[url" ["=" [URL]] "]" [document_list] "[/url]"

self_tag: "[self][/self]"
if_tag: "[if=" CONDITION "]" [document_list] "[/if]" [ "[else]" [document_list] "[/else]" ]
if_tag: "[if=" CONDITION "]" [document_list] "[/if]" [ "[else]" document_list "[/else]" ]

user_tag_root: "[user]" user_tag "[/user]"
user_tag: user_tag_generic | """
user_tag_root: user_tag
user_tag: generic_tag | """

DESCRIPTION_GRAMMAR += ' | '.join(f'user_tag_{tag}' for tag in SUPPORTED_USER_TAGS)
for tag, alts in SUPPORTED_USER_TAGS.items():
    DESCRIPTION_GRAMMAR += f'\n user_tag_{tag}: '
    DESCRIPTION_GRAMMAR += ' | '.join(f'"[{alt}" ["=" USERNAME] "]" USERNAME "[/{alt}]" | "[{alt}" "=" USERNAME "]" [user_tag] "[/{alt}]"' for alt in alts)
DESCRIPTION_GRAMMAR += ' | '.join(f'{tag}_tag' for tag in SUPPORTED_USER_TAGS)
DESCRIPTION_GRAMMAR += ''.join(f'\n {tag}_tag: "[{tag}" ["=" USERNAME] "]" USERNAME "[/{tag}]" | "[{tag}" "=" USERNAME "]" [user_tag] "[/{tag}]"' for tag in SUPPORTED_USER_TAGS)

DESCRIPTION_GRAMMAR += r"""
user_tag_generic: "[generic=" URL "]" USERNAME "[/generic]"
generic_tag: "[generic=" URL "]" USERNAME "[/generic]"

siteurl_tag_root: "[siteurl]" siteurl_tag "[/siteurl]"
siteurl_tag: siteurl_tag_generic | """

DESCRIPTION_GRAMMAR += ' | '.join(f'siteurl_tag_{tag}' for tag in SUPPORTED_SITE_TAGS)
for tag, alts in SUPPORTED_SITE_TAGS.items():
    DESCRIPTION_GRAMMAR += f'\n siteurl_tag_{tag}: '
    DESCRIPTION_GRAMMAR += ' | '.join(f'"[{alt}" "=" URL "]" ( siteurl_tag | TEXT ) "[/{alt}]"' for alt in alts)

DESCRIPTION_GRAMMAR += r"""
siteurl_tag_generic: "[generic=" URL "]" TEXT "[/generic]"

USERNAME: / *@?[a-zA-Z0-9][a-zA-Z0-9 @._-]*/
URL: / *(https?:\/\/)?[^\]]+ */
USERNAME: /[a-zA-Z0-9][a-zA-Z0-9 _-]*/
URL: /(https?:\/\/)?[^\]]+/
TEXT: /([^\[]|[ \t\r\n])+/
CONDITION: / *[a-z]+ *(==|!=) *[a-zA-Z0-9_-]+ *| *[a-z]+ +in +([a-zA-Z0-9_-]+ *, *)*[a-zA-Z0-9_-]+ */
CONDITION: / *[a-z]+ *(==|!=) *[a-zA-Z0-9]+ */
"""

DESCRIPTION_PARSER = lark.Lark(DESCRIPTION_GRAMMAR, parser='lalr')


class DescriptionParsingError(ValueError):
    pass

class SiteSwitchTag:
class UserTag:
    def __init__(self, default: typing.Optional[str]=None, **kwargs):
        self.default = default
        self._sites: typing.OrderedDict[str, typing.Optional[str]] = OrderedDict()
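Both versions assemble the grammar by joining per-tag rule strings before handing the result to lark. The technique can be sketched with plain string building (illustrative tag names and a simplified production; the real rules also carry the no-attribute alternative):

```python
# Build a rule with one alternative per tag, plus a per-tag production,
# the way the script extends DESCRIPTION_GRAMMAR with ' | '.join(...).
tags = ['eka', 'fa', 'weasyl']

grammar = 'user_tag: generic_tag | '
grammar += ' | '.join(f'{tag}_tag' for tag in tags)
grammar += ''.join(
    f'\n{tag}_tag: "[{tag}" "=" USERNAME "]" [user_tag] "[/{tag}]"'
    for tag in tags
)
```

Because the alternatives are generated from one list, adding a website to the list automatically extends both the dispatch rule and the set of productions.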
@@ -95,53 +70,30 @@ class SiteSwitchTag:
    def __getitem__(self, name: str) -> typing.Optional[str]:
        return self._sites.get(name)

    def __contains__(self, name: str) -> bool:
        return name in self._sites

    @property
    def sites(self):
        yield from self._sites

class UploadTransformer(lark.Transformer):
    def __init__(self, define_options=set(), *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.define_options = define_options
        # Init user_tag_xxxx methods
    def __init__(self, *args, **kwargs):
        super(UploadTransformer, self).__init__(*args, **kwargs)
        def _user_tag_factory(tag):
            # Create a new user SiteSwitchTag if innermost node, or append to list in order
            # Create a new UserTag if innermost node, or append to list in order
            def user_tag(data):
                attribute, inner = data[0], data[1]
                if attribute and attribute.strip():
                    if isinstance(inner, SiteSwitchTag):
                    if isinstance(inner, UserTag):
                        inner[tag] = attribute.strip()
                        return inner
                    user = SiteSwitchTag(default=inner and inner.strip())
                    user = UserTag(default=inner and inner.strip())
                    user[tag] = attribute.strip()
                    return user
                user = SiteSwitchTag()
                user = UserTag()
                user[tag] = inner.strip()
                return user
            return user_tag
        for tag in SUPPORTED_USER_TAGS:
            setattr(self, f'user_tag_{tag}', _user_tag_factory(tag))
        # Init siteurl_tag_xxxx methods
        def _siteurl_tag_factory(tag):
            # Create a new siteurl SiteSwitchTag if innermost node, or append to list in order
            def siteurl_tag(data):
                attribute, inner = data[0], data[1]
                if attribute and attribute.strip():
                    if isinstance(inner, SiteSwitchTag):
                        inner[tag] = attribute.strip()
                        return inner
                    siteurl = SiteSwitchTag(default=inner and inner.strip())
                    siteurl[tag] = attribute.strip()
                    return siteurl
                siteurl = SiteSwitchTag()
                siteurl[tag] = inner.strip()
                return siteurl
            return siteurl_tag
        for tag in SUPPORTED_SITE_TAGS:
            setattr(self, f'siteurl_tag_{tag}', _siteurl_tag_factory(tag))
            setattr(self, f'{tag}_tag', _user_tag_factory(tag))

    def document_list(self, data):
        return ''.join(data)
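Both versions generate one handler method per tag with `setattr`, going through a factory function rather than defining the closure directly in the loop. A minimal sketch of why the factory matters (names are illustrative): a closure defined inside a loop would capture the loop variable by reference and see only its final value, while the factory's argument binds each tag at call time.

```python
class TagMethods:
    """Sketch of per-tag method generation via a closure factory."""

    def __init__(self, tags):
        def _factory(tag):
            # `tag` is bound per factory call, avoiding the late-binding
            # pitfall of closures created directly inside a loop.
            def handler(data):
                return (tag, data)
            return handler

        for tag in tags:
            setattr(self, f'{tag}_tag', _factory(tag))

methods = TagMethods(['eka', 'fa'])
```

Each generated attribute is a distinct function remembering its own tag, which is exactly what lark's transformer dispatch needs.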
@@ -158,9 +110,6 @@ class UploadTransformer(lark.Transformer):
    def u_tag(self, _):
        raise NotImplementedError('UploadTransformer.u_tag is abstract')

    def center_tag(self, _):
        raise NotImplementedError('UploadTransformer.center_tag is abstract')

    def url_tag(self, _):
        raise NotImplementedError('UploadTransformer.url_tag is abstract')
@@ -170,86 +119,57 @@ class UploadTransformer(lark.Transformer):
    def transformer_matches_site(self, site: str) -> bool:
        raise NotImplementedError('UploadTransformer.transformer_matches_site is abstract')

    def transformer_matches_define(self, option: str) -> bool:
        return option in self.define_options

    def if_tag(self, data: typing.Tuple[str, str, str]):
        condition, truthy_document, falsy_document = data[0], data[1], data[2]
        # Test equality condition, i.e. `site==foo`
        equality_condition = condition.split('==', 1)
        condition, truthy_document, falsy_document = data
        equality_condition = condition.split('==')
        if len(equality_condition) == 2 and equality_condition[1].strip():
            conditional_test = f'transformer_matches_{equality_condition[0].strip()}'
            if hasattr(self, conditional_test):
                if getattr(self, conditional_test)(equality_condition[1].strip()):
                    return truthy_document or ''
                return falsy_document or ''
        # Test inequality condition, i.e. `site!=foo`
        inequality_condition = condition.split('!=', 1)
        inequality_condition = condition.split('!=')
        if len(inequality_condition) == 2 and inequality_condition[1].strip():
            conditional_test = f'transformer_matches_{inequality_condition[0].strip()}'
            if hasattr(self, conditional_test):
                if not getattr(self, conditional_test)(inequality_condition[1].strip()):
                    return truthy_document or ''
                return falsy_document or ''
        # Test inclusion condition, i.e. `site in foo,bar`
        inclusion_condition = condition.split(' in ', 1)
        if len(inclusion_condition) == 2 and inclusion_condition[1].strip():
            conditional_test = f'transformer_matches_{inclusion_condition[0].strip()}'
            if hasattr(self, conditional_test):
                matches = (parameter.strip() for parameter in inclusion_condition[1].split(','))
                if any(getattr(self, conditional_test)(match) for match in matches):
                    return truthy_document or ''
                return falsy_document or ''
        raise ValueError(f'Invalid [if][/if] tag condition: {condition}')

    def user_tag_root(self, data):
        user_data: SiteSwitchTag = data[0]
        user_data: UserTag = data[0]
        for site in user_data.sites:
            if site == 'generic':
                return self.url_tag((user_data['generic'], user_data.default))
            elif site == 'aryion':
                return self.url_tag((f'https://aryion.com/g4/user/{user_data["aryion"]}', user_data.default or user_data["aryion"]))
            elif site == 'furaffinity':
                return self.url_tag((f'https://furaffinity.net/user/{user_data["furaffinity"].replace("_", "")}', user_data.default or user_data['furaffinity']))
                return self.url_tag((user_data['generic'].strip(), user_data.default))
            elif site == 'eka':
                return self.url_tag((f'https://aryion.com/g4/user/{user_data["eka"]}', user_data.default or user_data["eka"]))
            elif site == 'fa':
                return self.url_tag((f'https://furaffinity.net/user/{user_data["fa"].replace("_", "")}', user_data.default or user_data['fa']))
            elif site == 'weasyl':
                return self.url_tag((f'https://www.weasyl.com/~{user_data["weasyl"].replace(" ", "").lower()}', user_data.default or user_data['weasyl']))
            elif site == 'inkbunny':
                return self.url_tag((f'https://inkbunny.net/{user_data["inkbunny"]}', user_data.default or user_data['inkbunny']))
            elif site == 'sofurry':
                return self.url_tag((f'https://{user_data["sofurry"].replace(" ", "-").lower()}.sofurry.com', user_data.default or user_data['sofurry']))
            elif site == 'ib':
                return self.url_tag((f'https://inkbunny.net/{user_data["ib"]}', user_data.default or user_data['ib']))
            elif site == 'sf':
                return self.url_tag((f'https://{user_data["sf"].replace(" ", "-").lower()}.sofurry.com', user_data.default or user_data['sf']))
            elif site == 'twitter':
                return self.url_tag((f'https://twitter.com/{user_data["twitter"].rsplit("@", 1)[-1]}', user_data.default or user_data['twitter']))
            elif site == 'mastodon':
                *_, mastodon_user, mastodon_instance = user_data["mastodon"].rsplit('@', 2)
                return self.url_tag((f'https://{mastodon_instance.strip()}/@{mastodon_user.strip()}', user_data.default or user_data['mastodon']))
                return self.url_tag((f'https://{mastodon_instance}/@{mastodon_user}', user_data.default or user_data['mastodon']))
            else:
                print(f'Unknown site "{site}" found in user tag; ignoring...')
        raise TypeError('Invalid user SiteSwitchTag data - no matches found')
        raise TypeError('Invalid UserTag data')

    def user_tag(self, data):
        return data[0]

    def user_tag_generic(self, data):
    def generic_tag(self, data):
        attribute, inner = data[0], data[1]
        user = SiteSwitchTag(default=inner.strip())
        user = UserTag(default=inner.strip())
        user['generic'] = attribute.strip()
        return user

    def siteurl_tag_root(self, data):
        siteurl_data: SiteSwitchTag = data[0]
        if 'generic' in siteurl_data:
            return self.url_tag((siteurl_data['generic'], siteurl_data.default))
        return ''

    def siteurl_tag(self, data):
        return data[0]

    def siteurl_tag_generic(self, data):
        attribute, inner = data[0], data[1]
        siteurl = SiteSwitchTag(default=inner.strip())
        siteurl['generic'] = attribute.strip()
        return siteurl

class BbcodeTransformer(UploadTransformer):
    def b_tag(self, data):
        if data[0] is None or not data[0].strip():
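The "innermost tag wins" behavior documented in the README falls out of insertion order: the parser reduces the innermost nested tag first, so it is the first key stored, and `user_tag_root` returns on the first recognized site while iterating. A simplified stand-in for the tag container (hypothetical class name, reduced API):

```python
from collections import OrderedDict

class SwitchTag:
    """Sketch of the site-switch container used for [user]/[siteurl] tags."""

    def __init__(self, default=None, **kwargs):
        self.default = default
        self._sites = OrderedDict(kwargs)

    def __setitem__(self, site, value):
        self._sites[site] = value

    def __getitem__(self, site):
        return self._sites.get(site)

    @property
    def sites(self):
        # Yields sites in insertion order: innermost nested tag first.
        yield from self._sites

# [user][ib=Sit][weasyl=Amet][/weasyl][/ib][/user]:
tag = SwitchTag(weasyl='Amet')    # innermost tag is created first
tag['ib'] = 'Sit'                 # outer tag appends its site afterwards
resolved = next(iter(tag.sites))  # first recognized site wins elsewhere
```

For a website that is neither Inkbunny nor Weasyl, resolution walks the sites in this order and uses the first one it knows how to link, which is why the innermost tag acts as the fallback.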
@@ -266,15 +186,8 @@ class BbcodeTransformer(UploadTransformer):
            return ''
        return f'[u]{data[0]}[/u]'

    def center_tag(self, data):
        if data[0] is None or not data[0].strip():
            return ''
        return f'[center]{data[0]}[/center]'

    def url_tag(self, data):
        if data[0] is None or not data[0].strip():
            return data[1].strip() if data[1] else ''
        return f'[url={data[0].strip()}]{data[1] if data[1] and data[1].strip() else data[0].strip()}[/url]'
        return f'[url={data[0] or ""}]{data[1] or ""}[/url]'

class MarkdownTransformer(UploadTransformer):
    def b_tag(self, data):
@@ -293,9 +206,7 @@ class MarkdownTransformer(UploadTransformer):
        return f'<u>{data[0]}</u>' # Markdown should support simple HTML tags

    def url_tag(self, data):
        if data[0] is None or not data[0].strip():
            return data[1].strip() if data[1] else ''
        return f'[{data[1] if data[1] and data[1].strip() else data[0].strip()}]({data[0].strip()})'
        return f'[{data[1] or ""}]({data[0] or ""})'

class PlaintextTransformer(UploadTransformer):
    def b_tag(self, data):
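The per-dialect transformers differ mostly in the output template of methods like `url_tag`. A side-by-side sketch of the BBCode and Markdown variants (simplified free functions with illustrative names; the real methods take lark tree data and live on the transformer classes):

```python
def bbcode_url(url, text):
    # No URL: fall back to the bare link text.
    if not url or not url.strip():
        return text.strip() if text else ''
    # No text: reuse the URL as the visible label.
    label = text if text and text.strip() else url.strip()
    return f'[url={url.strip()}]{label}[/url]'

def markdown_url(url, text):
    if not url or not url.strip():
        return text.strip() if text else ''
    label = text if text and text.strip() else url.strip()
    return f'[{label}]({url.strip()})'
```

Because every site-specific link in `user_tag_root` is emitted through `url_tag`, swapping the transformer is enough to retarget the whole description to another markup dialect.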
@ -307,209 +218,140 @@ class PlaintextTransformer(UploadTransformer):
|
|||
def u_tag(self, data):
|
||||
return str(data[0]) if data[0] else ''
|
||||
|
||||
def center_tag(self, data):
|
||||
return str(data[0]) if data[0] else ''
|
||||
|
||||
def url_tag(self, data):
|
||||
if data[0] is None or not data[0].strip():
|
||||
return data[1] if data[1] and data[1].strip() else ''
|
||||
if data[1] is None or not data[1].strip():
|
||||
return data[0].strip()
|
||||
return f'{data[1]}: {data[0].strip()}'
|
||||
return str(data[0]) if data[0] else ''
|
||||
return f'{data[1].strip()}: {data[0] or ""}'
|
||||
|
||||
def user_tag_root(self, data):
|
||||
user_data = data[0]
|
||||
for site in user_data.sites:
|
||||
if site == 'generic':
|
||||
break
|
||||
elif site == 'aryion':
|
||||
return f'{user_data["aryion"]} on Eka\'s Portal'
|
||||
elif site == 'furaffinity':
|
||||
return f'{user_data["furaffinity"]} on Fur Affinity'
|
||||
elif site == 'eka':
|
||||
return f'{user_data["eka"]} on Eka\'s Portal'
|
||||
elif site == 'fa':
|
||||
return f'{user_data["fa"]} on Fur Affinity'
|
||||
elif site == 'weasyl':
|
||||
return f'{user_data["weasyl"]} on Weasyl'
|
||||
elif site == 'inkbunny':
|
||||
return f'{user_data["inkbunny"]} on Inkbunny'
|
||||
elif site == 'sofurry':
|
||||
return f'{user_data["sofurry"]} on SoFurry'
|
||||
elif site == 'ib':
|
||||
return f'{user_data["ib"]} on Inkbunny'
|
||||
elif site == 'sf':
|
||||
return f'{user_data["sf"]} on SoFurry'
|
||||
elif site == 'twitter':
|
||||
return f'@{user_data["twitter"].rsplit("@", 1)[-1]} on Twitter'
|
||||
elif site == 'mastodon':
|
||||
*_, mastodon_user, mastodon_instance = user_data["mastodon"].rsplit('@', 2)
|
||||
return f'@{mastodon_user.strip()} on {mastodon_instance.strip()}'
|
||||
return f'@{mastodon_user} on {mastodon_instance}'
|
||||
else:
|
||||
print(f'Unknown site "{site}" found in user tag; ignoring...')
|
||||
return super().user_tag_root(data)
|
||||
return super(PlaintextTransformer, self).user_tag_root(data)
|
||||
|
||||
class AryionTransformer(BbcodeTransformer):
|
||||
def __init__(self, self_user=None, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
def __init__(self, self_user, *args, **kwargs):
|
||||
super(AryionTransformer, self).__init__(*args, **kwargs)
|
||||
def self_tag(data):
|
||||
if self_user:
|
||||
return self.user_tag_root((SiteSwitchTag(aryion=self_user),))
|
||||
raise ValueError('self_tag is unavailable for AryionTransformer - no user provided')
|
||||
return self.user_tag_root((UserTag(eka=self_user),))
|
||||
self.self_tag = self_tag
|
||||
|
||||
@staticmethod
|
||||
def transformer_matches_site(site: str) -> bool:
|
||||
return site in SUPPORTED_USER_TAGS['aryion']
|
||||
def transformer_matches_site(self, site: str) -> bool:
|
||||
return site in ('eka', 'aryion')
|
||||
|
||||
def user_tag_root(self, data):
|
||||
user_data: SiteSwitchTag = data[0]
|
||||
if user_data['aryion']:
|
||||
return f':icon{user_data["aryion"]}:'
|
||||
return super().user_tag_root(data)
|
||||
|
||||
def siteurl_tag_root(self, data):
|
||||
siteurl_data: SiteSwitchTag = data[0]
|
||||
if 'aryion' in siteurl_data:
|
||||
return self.url_tag((siteurl_data['aryion'], siteurl_data.default))
|
||||
return super().siteurl_tag_root(data)
|
||||
user_data = data[0]
|
||||
if user_data['eka']:
|
||||
return f':icon{user_data["eka"]}:'
|
||||
return super(AryionTransformer, self).user_tag_root(data)
|
||||
|
||||
class FuraffinityTransformer(BbcodeTransformer):
|
||||
def __init__(self, self_user=None, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
def __init__(self, self_user, *args, **kwargs):
|
||||
super(FuraffinityTransformer, self).__init__(*args, **kwargs)
|
||||
def self_tag(data):
|
||||
if self_user:
|
||||
return self.user_tag_root((SiteSwitchTag(furaffinity=self_user),))
|
||||
raise ValueError('self_tag is unavailable for FuraffinityTransformer - no user provided')
|
||||
return self.user_tag_root((UserTag(fa=self_user),))
|
||||
self.self_tag = self_tag
|
||||
|
||||
@staticmethod
|
||||
def transformer_matches_site(site: str) -> bool:
|
||||
return site in SUPPORTED_USER_TAGS['furaffinity']
|
||||
def transformer_matches_site(self, site: str) -> bool:
|
||||
return site in ('fa', 'furaffinity')
|
||||
|
||||
def user_tag_root(self, data):
|
||||
user_data: SiteSwitchTag = data[0]
|
||||
if user_data['furaffinity']:
|
||||
return f':icon{user_data["furaffinity"]}:'
|
||||
return super().user_tag_root(data)
|
||||
|
||||
def siteurl_tag_root(self, data):
|
||||
siteurl_data: SiteSwitchTag = data[0]
|
||||
if 'furaffinity' in siteurl_data:
|
||||
return self.url_tag((siteurl_data['furaffinity'], siteurl_data.default))
|
||||
return super().siteurl_tag_root(data)
|
||||
user_data = data[0]
|
||||
if user_data['fa']:
|
||||
return f':icon{user_data["fa"]}:'
|
||||
return super(FuraffinityTransformer, self).user_tag_root(data)
|
||||
|
||||
class WeasylTransformer(MarkdownTransformer):
|
||||
def __init__(self, self_user=None, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
def __init__(self, self_user, *args, **kwargs):
|
||||
super(WeasylTransformer, self).__init__(*args, **kwargs)
|
||||
def self_tag(data):
|
||||
if self_user:
|
||||
return self.user_tag_root((SiteSwitchTag(weasyl=self_user),))
|
||||
raise ValueError('self_tag is unavailable for WeasylTransformer - no user provided')
|
||||
return self.user_tag_root((UserTag(weasyl=self_user),))
|
||||
self.self_tag = self_tag
|
||||
|
||||
@staticmethod
|
||||
def transformer_matches_site(site: str) -> bool:
|
||||
def transformer_matches_site(self, site: str) -> bool:
|
||||
return site == 'weasyl'
|
||||
|
||||
def center_tag(self, data):
|
||||
if data[0] is None or not data[0].strip():
|
||||
return ''
|
||||
return f'<div class="align-center">{data[0]}</div>'
|
||||
|
||||
def user_tag_root(self, data):
|
||||
user_data: SiteSwitchTag = data[0]
|
||||
user_data = data[0]
|
||||
if user_data['weasyl']:
|
||||
return f'<!~{user_data["weasyl"].replace(" ", "")}>'
|
||||
if user_data.default is None:
|
||||
for site in user_data.sites:
|
||||
if site == 'furaffinity':
|
||||
return f'<fa:{user_data["furaffinity"]}>'
|
||||
if site == 'inkbunny':
|
||||
return f'<ib:{user_data["inkbunny"]}>'
|
||||
if site == 'sofurry':
|
||||
return f'<sf:{user_data["sofurry"]}>'
|
||||
return super().user_tag_root(data)
|
||||
|
||||
def siteurl_tag_root(self, data):
|
||||
siteurl_data: SiteSwitchTag = data[0]
|
||||
if 'weasyl' in siteurl_data:
|
||||
return self.url_tag((siteurl_data['weasyl'], siteurl_data.default))
|
||||
return super().siteurl_tag_root(data)
|
||||
if site == 'fa':
|
||||
return f'<fa:{user_data["fa"]}>'
|
||||
if site == 'ib':
|
||||
return f'<ib:{user_data["ib"]}>'
|
||||
if site == 'sf':
|
||||
return f'<sf:{user_data["sf"]}>'
|
||||
return super(WeasylTransformer, self).user_tag_root(data)
|
||||
|
||||
class InkbunnyTransformer(BbcodeTransformer):
|
||||
def __init__(self, self_user=None, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
def __init__(self, self_user, *args, **kwargs):
|
||||
super(InkbunnyTransformer, self).__init__(*args, **kwargs)
|
||||
def self_tag(data):
|
||||
if self_user:
|
||||
return self.user_tag_root((SiteSwitchTag(inkbunny=self_user),))
|
||||
raise ValueError('self_tag is unavailable for InkbunnyTransformer - no user provided')
|
||||
return self.user_tag_root((UserTag(ib=self_user),))
|
||||
self.self_tag = self_tag
|
||||
|
||||
@staticmethod
|
||||
def transformer_matches_site(site: str) -> bool:
|
||||
return site in SUPPORTED_USER_TAGS['inkbunny']
|
||||
def transformer_matches_site(self, site: str) -> bool:
|
||||
return site in ('ib', 'inkbunny')
|
||||
|
||||
def user_tag_root(self, data):
|
||||
user_data: SiteSwitchTag = data[0]
|
||||
if user_data['inkbunny']:
|
||||
return f'[iconname]{user_data["inkbunny"]}[/iconname]'
|
||||
user_data = data[0]
|
||||
if user_data['ib']:
|
||||
return f'[iconname]{user_data["ib"]}[/iconname]'
|
||||
if user_data.default is None:
|
||||
for site in user_data.sites:
|
||||
if site == 'furaffinity':
|
||||
return f'[fa]{user_data["furaffinity"]}[/fa]'
|
||||
if site == 'sofurry':
|
||||
return f'[sf]{user_data["sofurry"]}[/sf]'
|
||||
if site == 'fa':
|
||||
return f'[fa]{user_data["fa"]}[/fa]'
|
||||
if site == 'sf':
|
||||
return f'[sf]{user_data["sf"]}[/sf]'
|
||||
if site == 'weasyl':
|
||||
return f'[weasyl]{user_data["weasyl"].replace(" ", "").lower()}[/weasyl]'
|
||||
return super().user_tag_root(data)
|
||||
|
||||
def siteurl_tag_root(self, data):
|
||||
siteurl_data: SiteSwitchTag = data[0]
|
||||
if 'inkbunny' in siteurl_data:
|
||||
return self.url_tag((siteurl_data['inkbunny'], siteurl_data.default))
|
||||
return super().siteurl_tag_root(data)
|
||||
return super(InkbunnyTransformer, self).user_tag_root(data)
|
||||
|
||||
class SoFurryTransformer(BbcodeTransformer):
    def __init__(self, self_user=None, *args, **kwargs):
        super().__init__(*args, **kwargs)
    def __init__(self, self_user, *args, **kwargs):
        super(SoFurryTransformer, self).__init__(*args, **kwargs)
        def self_tag(data):
            if self_user:
                return self.user_tag_root((SiteSwitchTag(sofurry=self_user),))
            raise ValueError('self_tag is unavailable for SoFurryTransformer - no user provided')
            return self.user_tag_root((UserTag(sf=self_user),))
        self.self_tag = self_tag

    @staticmethod
    def transformer_matches_site(site: str) -> bool:
        return site in SUPPORTED_USER_TAGS['sofurry']
    def transformer_matches_site(self, site: str) -> bool:
        return site in ('sf', 'sofurry')

    def user_tag_root(self, data):
        user_data: SiteSwitchTag = data[0]
        if user_data['sofurry']:
            return f':icon{user_data["sofurry"]}:'
        user_data = data[0]
        if user_data['sf']:
            return f':icon{user_data["sf"]}:'
        if user_data.default is None:
            for site in user_data.sites:
                if site == 'furaffinity':
                    return f'fa!{user_data["furaffinity"]}'
                if site == 'inkbunny':
                    return f'ib!{user_data["inkbunny"]}'
        return super().user_tag_root(data)

    def siteurl_tag_root(self, data):
        siteurl_data: SiteSwitchTag = data[0]
        if 'sofurry' in siteurl_data:
            return self.url_tag((siteurl_data['sofurry'], siteurl_data.default))
        return super().siteurl_tag_root(data)
                if site == 'fa':
                    return f'fa!{user_data["fa"]}'
                if site == 'ib':
                    return f'ib!{user_data["ib"]}'
        return super(SoFurryTransformer, self).user_tag_root(data)

||||
def validate_parsed_tree(parsed_tree):
    for node in parsed_tree.iter_subtrees_topdown():
        if node.data in {'b_tag', 'i_tag', 'u_tag', 'url_tag'}:
            node_type = str(node.data)
            for node2 in node.find_data(node_type):
                if node != node2:
                    raise DescriptionParsingError(f'Invalid nested {node_type} on line {node2.data.line} column {node2.data.column}')

||||
def parse_description(description_path, config, out_dir, ignore_empty_files=False, define_options=set()):
    for proc in psutil.process_iter(['cmdline']):
        if proc.info['cmdline'] and 'libreoffice' in proc.info['cmdline'][0] and '--writer' in proc.info['cmdline'][1:]:
            if ignore_empty_files:
                print('WARN: LibreOffice Writer appears to be running. This command may output empty files until it is closed.')
                break
            print('WARN: LibreOffice Writer appears to be running. This command may raise an error until it is closed.')
            break

    description = ''
    with subprocess.Popen(('libreoffice', '--cat', description_path), stdout=subprocess.PIPE) as ps:
def parse_description(description_path, config_path, out_dir, ignore_empty_files=False):
    ps = subprocess.Popen(('libreoffice', '--cat', description_path), stdout=subprocess.PIPE)
    description = '\n'.join(line.strip() for line in io.TextIOWrapper(ps.stdout, encoding='utf-8-sig'))
    if not description or re.match(r'^\s+$', description):
        error = f'Description processing returned empty file: libreoffice --cat {description_path}'

@@ -518,21 +360,7 @@ def parse_description(description_path, config, out_dir, ignore_empty_files=Fals
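The LibreOffice output handling here (decode stdout as `utf-8-sig`, strip each line, join with newlines) can be exercised in isolation. This is a minimal sketch: `io.BytesIO` stands in for the subprocess pipe, and the sample bytes are invented.

```python
import io

# Simulate the bytes LibreOffice would write to stdout, including a UTF-8 BOM.
raw = io.BytesIO(b'\xef\xbb\xbfHello world\n  indented line  \n\n')

# utf-8-sig transparently drops the BOM; each line is stripped before joining.
description = '\n'.join(line.strip() for line in io.TextIOWrapper(raw, encoding='utf-8-sig'))
print(description)
```

Note how the trailing empty line survives as an empty string, which is why the emptiness check afterwards also matches whitespace-only output.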
    else:
        raise RuntimeError(error)

    try:
        parsed_description = DESCRIPTION_PARSER.parse(description)
    except lark.UnexpectedInput as e:
        input_error = e.match_examples(DESCRIPTION_PARSER.parse, {
            'Unclosed tag': ['[b]text', '[i]text', '[u]text', '[url]text'],
            'Unopened tag': ['text[/b]', 'text[/i]', 'text[/u]', 'text[/url]'],
            'Unknown tag': ['[invalid]text[/invalid]'],
            'Missing tag brackets': ['b]text[/b]', '[btext[/b]', '[b]text/b]', '[b]text[/b', 'i]text[/i]', '[itext[/i]', '[i]text/i]', '[i]text[/i', 'u]text[/u]', '[utext[/u]', '[u]text/u]', '[u]text[/u'],
            'Missing tag slash': ['[b]text[b]', '[i]text[i]', '[u]text[u]'],
            'Empty switch tag': ['[user][/user]', '[siteurl][/siteurl]'],
            'Empty user tag': ['[user][aryion][/aryion][/user]', '[user][furaffinity][/furaffinity][/user]', '[user][inkbunny][/inkbunny][/user]', '[user][sofurry][/sofurry][/user]', '[user][weasyl][/weasyl][/user]', '[user][twitter][/twitter][/user]', '[user][mastodon][/mastodon][/user]', '[user][aryion=][/aryion][/user]', '[user][furaffinity=][/furaffinity][/user]', '[user][inkbunny=][/inkbunny][/user]', '[user][sofurry=][/sofurry][/user]', '[user][weasyl=][/weasyl][/user]', '[user][twitter=][/twitter][/user]', '[user][mastodon=][/mastodon][/user]'],
            'Empty siteurl tag': ['[siteurl][aryion][/aryion][/siteurl]', '[siteurl][furaffinity][/furaffinity][/siteurl]', '[siteurl][inkbunny][/inkbunny][/siteurl]', '[siteurl][sofurry][/sofurry][/siteurl]', '[siteurl][weasyl][/weasyl][/siteurl]', '[siteurl][aryion=][/aryion][/siteurl]', '[siteurl][furaffinity=][/furaffinity][/siteurl]', '[siteurl][inkbunny=][/inkbunny][/siteurl]', '[siteurl][sofurry=][/sofurry][/siteurl]', '[siteurl][weasyl=][/weasyl][/siteurl]'],
        })
        raise DescriptionParsingError(f'Unable to parse description. {input_error or "Unknown grammar error"} in line {e.line} column {e.column}:\n{e.get_context(description)}') from e
    validate_parsed_tree(parsed_description)
    transformations = {
        'aryion': ('desc_aryion.txt', AryionTransformer),
        'furaffinity': ('desc_furaffinity.txt', FuraffinityTransformer),

@@ -540,9 +368,13 @@ def parse_description(description_path, config, out_dir, ignore_empty_files=Fals
        'sofurry': ('desc_sofurry.txt', SoFurryTransformer),
        'weasyl': ('desc_weasyl.md', WeasylTransformer),
    }
    # assert all(k in SUPPORTED_SITE_TAGS for k in transformations)
    with open(config_path, 'r') as f:
        config = json.load(f)
    # Validate JSON
    errors = []
    if type(config) is not dict:
        errors.append(ValueError('Configuration must be a JSON object'))
    else:
        for (website, username) in config.items():
            if website not in transformations:
                errors.append(ValueError(f'Website \'{website}\' is unsupported'))

@@ -550,19 +382,17 @@ def parse_description(description_path, config, out_dir, ignore_empty_files=Fals
                errors.append(ValueError(f'Website \'{website}\' has invalid username \'{json.dumps(username)}\''))
            elif username.strip() == '':
                errors.append(ValueError(f'Website \'{website}\' has empty username'))
    if not any(ws in config for ws in transformations):
    if not any(ws in config for ws in ('aryion', 'furaffinity', 'weasyl', 'inkbunny', 'sofurry')):
        errors.append(ValueError('No valid websites found'))
    if errors:
        raise ExceptionGroup('Invalid configuration for description parsing', errors)
    # Create descriptions
    RE_MULTIPLE_EMPTY_LINES = re.compile(r'\n\n+')
    re_multiple_empty_lines = re.compile(r'\n\n+')
    for (website, username) in config.items():
        (filepath, transformer) = transformations[website]
        with open(os.path.join(out_dir, filepath), 'w') as f:
            if description.strip():
                transformed_description = transformer(self_user=username, define_options=define_options).transform(parsed_description)
                cleaned_description = RE_MULTIPLE_EMPTY_LINES.sub('\n\n', transformed_description).strip()
                if cleaned_description:
                    f.write(cleaned_description)
                    f.write('\n')
                transformed_description = transformer(username).transform(parsed_description)
                f.write(re_multiple_empty_lines.sub('\n\n', transformed_description))
            else:
                f.write('')
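The `RE_MULTIPLE_EMPTY_LINES` cleanup applied to each transformed description boils down to a single substitution; a minimal sketch with invented input:

```python
import re

# Collapse any run of consecutive blank lines down to a single blank line,
# as the description writer does before emitting each per-site file.
RE_MULTIPLE_EMPTY_LINES = re.compile(r'\n\n+')

text = 'first\n\n\n\nsecond\n\nthird'
cleaned = RE_MULTIPLE_EMPTY_LINES.sub('\n\n', text).strip()
print(cleaned)
```

The `.strip()` afterwards also removes leading/trailing blank runs, so an all-whitespace transformation result produces an empty file rather than stray newlines.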
79
main.py
Executable file → Normal file

@@ -1,47 +1,18 @@
#!/usr/bin/env python
# PYTHON_ARGCOMPLETE_OK
import argcomplete
from argcomplete.completers import FilesCompleter, DirectoriesCompleter
import argparse
import json
import os
import re
from subprocess import CalledProcessError
import shutil
import tempfile

from description import parse_description
from story import parse_story
from sites import INVERSE_SUPPORTED_SITE_TAGS


def main(out_dir_path=None, story_path=None, description_path=None, file_paths=[], config_path=None, keep_out_dir=False, ignore_empty_files=False, define_options=[]):
def main(out_dir_path=None, story_path=None, description_path=None, file_path=None, config_path=None, keep_out_dir=False, ignore_empty_files=False):
    if not out_dir_path:
        raise ValueError('Missing out_dir_path')
    if not config_path:
        raise ValueError('Missing config_path')
    if not file_paths:
        file_paths = []
    if not define_options:
        define_options = []
    config = None
    if story_path or description_path:
        with open(config_path, 'r') as f:
            config_json = json.load(f)
        if type(config_json) is not dict:
            raise ValueError('The configuration file must contain a valid JSON object')
        config = {}
        for k, v in config_json.items():
            if type(v) is not str:
                raise ValueError(f'Invalid configuration value for entry "{k}": expected string, got {type(v)}')
            new_k = INVERSE_SUPPORTED_SITE_TAGS.get(k)
            if not new_k:
                print(f'Ignoring unknown configuration key "{k}"...')
                continue
            if new_k in config:
                raise ValueError(f'Duplicate configuration entry for website "{new_k}": found collision with key "{k}"')
            config[new_k] = v
        if len(config) == 0:
            raise ValueError(f'Invalid configuration file "{config_path}": no valid sites defined')
    remove_out_dir = not keep_out_dir and os.path.isdir(out_dir_path)
    with tempfile.TemporaryDirectory() as tdir:
        # Clear output dir if it exists and shouldn't be kept

@@ -53,17 +24,14 @@ def main(out_dir_path=None, story_path=None, description_path=None, file_paths=[
        try:
            # Convert original file to .rtf (Aryion) and .txt (all others)
            if story_path:
                parse_story(story_path, config, out_dir_path, tdir, ignore_empty_files)
                parse_story(story_path, config_path, out_dir_path, tdir, ignore_empty_files)

            # Parse FA description and convert for each website
            if description_path:
                define_options_set = set(define_options)
                if len(define_options_set) < len(define_options):
                    print('WARNING: duplicated entries defined with -D / --define-option')
                parse_description(description_path, config, out_dir_path, ignore_empty_files, define_options)
                parse_description(description_path, config_path, out_dir_path, ignore_empty_files)

            # Copy generic files over to output
            for file_path in file_paths:
            # Copy generic file over to output
            if file_path:
                shutil.copy(file_path, out_dir_path)

        except CalledProcessError as e:

@@ -84,41 +52,32 @@ def main(out_dir_path=None, story_path=None, description_path=None, file_paths=[
if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='generate multi-gallery upload-ready files')
    parser.add_argument('-o', '--output-dir', dest='out_dir_path', default='./out',
                        help='path of output directory').completer = DirectoriesCompleter
                        help='path of output directory')
    parser.add_argument('-c', '--config', dest='config_path', default='./config.json',
                        help='path of JSON configuration file').completer = FilesCompleter
    parser.add_argument('-D', '--define-option', dest='define_options', action='append',
                        help='options to define as a truthy value when parsing descriptions')
                        help='path of JSON configuration file')
    parser.add_argument('-s', '--story', dest='story_path',
                        help='path of LibreOffice-readable story file').completer = FilesCompleter
                        help='path of LibreOffice-readable story file')
    parser.add_argument('-d', '--description', dest='description_path',
                        help='path of BBCode-formatted description file').completer = FilesCompleter
    parser.add_argument('-f', '--file', dest='file_paths', action='append',
                        help='path(s) of generic file(s) to include in output (i.e. an image or thumbnail)').completer = FilesCompleter
                        help='path of BBCode-formatted description file')
    parser.add_argument('-f', '--file', dest='file_path',
                        help='path of generic file to include in output (i.e. an image or thumbnail)')
    parser.add_argument('-k', '--keep-out-dir', dest='keep_out_dir', action='store_true',
                        help='whether output directory contents should be kept.\nif set, a script error may leave partial files behind')
    parser.add_argument('-I', '--ignore-empty-files', dest='ignore_empty_files', action='store_true',
                        help='do not raise an error if any input file is empty or whitespace-only')
    argcomplete.autocomplete(parser)
    args = parser.parse_args()

    file_paths = args.file_paths or []
    if not (args.story_path or args.description_path or any(file_paths)):
        parser.error('at least one of ( --story | --description | --file ) must be set')
    if not any([args.story_path, args.description_path]):
        parser.error('at least one of ( --story | --description ) must be set')
    if args.out_dir_path and os.path.exists(args.out_dir_path) and not os.path.isdir(args.out_dir_path):
        parser.error(f'--output-dir {args.out_dir_path} must be an existing directory or inexistent; found a file instead')
        parser.error('--output-dir must be an existing directory or inexistent')
    if args.story_path and not os.path.isfile(args.story_path):
        parser.error(f'--story {args.story_path} is not a valid file')
        parser.error('--story must be a valid file')
    if args.description_path and not os.path.isfile(args.description_path):
        parser.error(f'--description {args.description_path} is not a valid file')
    for file_path in file_paths:
        if not os.path.isfile(file_path):
            parser.error(f'--file {file_path} is not a valid file')
    if (args.story_path or args.description_path) and args.config_path and not os.path.isfile(args.config_path):
        parser.error('--description must be a valid file')
    if args.file_path and not os.path.isfile(args.file_path):
        parser.error('--file must be a valid file')
    if args.config_path and not os.path.isfile(args.config_path):
        parser.error('--config must be a valid file')
    if args.define_options:
        for option in args.define_options:
            if not re.match(r'^[a-zA-Z0-9_-]+$', option):
                parser.error(f'--define-option {option} is not a valid option; it must only contain alphanumeric characters, dashes, or underscores')

    main(**vars(args))
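The `--define-option` check accepts only simple identifier-like names; the same validation as a standalone sketch (the sample option names are invented):

```python
import re

# Option names may only contain alphanumerics, dashes, and underscores.
OPTION_RE = re.compile(r'^[a-zA-Z0-9_-]+$')

def is_valid_option(option: str) -> bool:
    """Return True if the option name is safe to use as a define flag."""
    return bool(OPTION_RE.match(option))

print(is_valid_option('test_parse_description'))  # identifier-like: accepted
print(is_valid_option('bad option!'))             # spaces/punctuation: rejected
```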

@@ -1,4 +1 @@
argcomplete==3.2.1
lark==1.1.8
parameterized==0.9.0
psutil==5.9.6
lark==1.1.5
13
sites.py

@@ -1,13 +0,0 @@
import itertools
import typing

SUPPORTED_SITE_TAGS: typing.Mapping[str, typing.Set[str]] = {
    'aryion': {'aryion', 'eka', 'eka_portal'},
    'furaffinity': {'furaffinity', 'fa'},
    'weasyl': {'weasyl'},
    'inkbunny': {'inkbunny', 'ib'},
    'sofurry': {'sofurry', 'sf'},
}

INVERSE_SUPPORTED_SITE_TAGS: typing.Mapping[str, str] = \
    dict(itertools.chain.from_iterable(zip(v, itertools.repeat(k)) for (k, v) in SUPPORTED_SITE_TAGS.items()))
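The inversion in `sites.py` flattens each alias set into a lookup from alias back to canonical site name. A minimal sketch of the same construction, using a reduced toy mapping in place of the full `SUPPORTED_SITE_TAGS`:

```python
import itertools

# Toy data with the same shape as SUPPORTED_SITE_TAGS in sites.py.
SUPPORTED_SITE_TAGS = {
    'furaffinity': {'furaffinity', 'fa'},
    'inkbunny': {'inkbunny', 'ib'},
}

# Pair every alias with its canonical key, then collect into one dict.
inverse = dict(itertools.chain.from_iterable(
    zip(aliases, itertools.repeat(canonical))
    for (canonical, aliases) in SUPPORTED_SITE_TAGS.items()
))
print(inverse['fa'], inverse['ib'])
```

A dict comprehension (`{alias: canonical for canonical, aliases in … for alias in aliases}`) would produce the same result; the `itertools` form mirrors the source.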
37
story.py

@@ -1,7 +1,6 @@
import io
import json
import os
import psutil
import re
import subprocess

@@ -17,57 +16,43 @@ def get_rtf_styles(rtf_source: str):
        rtf_styles[style_name] = rtf_style
    return rtf_styles

def parse_story(story_path, config, out_dir, temp_dir, ignore_empty_files=False):
    should_create_txt_story = any(ws in config for ws in ('furaffinity', 'inkbunny', 'sofurry'))
    should_create_md_story = any(ws in config for ws in ('weasyl',))
def parse_story(story_path, config_path, out_dir, temp_dir, ignore_empty_files=False):
    with open(config_path, 'r') as f:
        config = json.load(f)
    if type(config) is not dict:
        raise ValueError('Invalid configuration for story parsing: Configuration must be a JSON object')
    should_create_txt_story = any(ws in config for ws in ('furaffinity', 'weasyl', 'inkbunny', 'sofurry'))
    should_create_rtf_story = any(ws in config for ws in ('aryion',))
    if not (should_create_txt_story or should_create_md_story or should_create_rtf_story):
    if not should_create_txt_story and not should_create_rtf_story:
        raise ValueError('Invalid configuration for story parsing: No valid websites found')

    for proc in psutil.process_iter(['cmdline']):
        if proc.info['cmdline'] and 'libreoffice' in proc.info['cmdline'][0] and '--writer' in proc.info['cmdline'][1:]:
            if ignore_empty_files:
                print('WARN: LibreOffice Writer appears to be running. This command may output empty files until it is closed.')
                break
            print('WARN: LibreOffice Writer appears to be running. This command may raise an error until it is closed.')
            break

    story_filename = os.path.split(story_path)[1].rsplit('.')[0]
    txt_out_path = os.path.join(out_dir, f'{story_filename}.txt') if should_create_txt_story else os.devnull
    md_out_path = os.path.join(out_dir, f'{story_filename}.md') if should_create_md_story else os.devnull
    txt_tmp_path = os.path.join(temp_dir, f'{story_filename}.txt') if should_create_rtf_story else os.devnull
    RE_EMPTY_LINE = re.compile(r'^$')
    RE_SEQUENTIAL_EQUAL_SIGNS = re.compile(r'=(?==)')
    RE_EMPTY_LINE = re.compile('^$')
    is_only_empty_lines = True
    with subprocess.Popen(('libreoffice', '--cat', story_path), stdout=subprocess.PIPE) as ps:
        # Mangle output files so that .RTF will always have a single LF between lines, and .TXT/.MD can have one or two CRLF
        with open(txt_out_path, 'w', newline='\r\n') as txt_out, open(md_out_path, 'w', newline='\r\n') as md_out, open(txt_tmp_path, 'w') as txt_tmp:
    ps = subprocess.Popen(('libreoffice', '--cat', story_path), stdout=subprocess.PIPE)
    # Mangle output files so that .RTF will always have a single LF between lines, and .TXT can have one or two CRLF
    with open(txt_out_path, 'w', newline='\r\n') as txt_out, open(txt_tmp_path, 'w') as txt_tmp:
            needs_empty_line = False
            for line in io.TextIOWrapper(ps.stdout, encoding='utf-8-sig'):
                # Remove empty lines
                line = line.strip()
                md_line = line
                if RE_EMPTY_LINE.search(line) and not is_only_empty_lines:
                    needs_empty_line = True
                else:
                    if should_create_md_story:
                        md_line = RE_SEQUENTIAL_EQUAL_SIGNS.sub('= ', line.replace(r'*', r'\*'))
                    if is_only_empty_lines:
                        txt_out.writelines((line,))
                        md_out.writelines((md_line,))
                        txt_tmp.writelines((line,))
                        is_only_empty_lines = False
                    else:
                        if needs_empty_line:
                            txt_out.writelines(('\n\n', line))
                            md_out.writelines(('\n\n', md_line))
                            needs_empty_line = False
                        else:
                            txt_out.writelines(('\n', line))
                            md_out.writelines(('\n', md_line))
                        txt_tmp.writelines(('\n', line))
            txt_out.writelines(('\n'))
            md_out.writelines(('\n'))
    if is_only_empty_lines:
        error = f'Story processing returned empty file: libreoffice --cat {story_path}'
        if ignore_empty_files:
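The Markdown mangling applied to `md_line` can be sketched in isolation. The sample line is invented, and the reading of the `=` padding (presumably keeping runs of `=` from being rendered as a setext-style heading underline) is an assumption:

```python
import re

# A '=' that is immediately followed by another '=' gets a space appended,
# so runs like '===' are broken up; '*' is backslash-escaped first.
RE_SEQUENTIAL_EQUAL_SIGNS = re.compile(r'=(?==)')

line = '*bold* and ==='
md_line = RE_SEQUENTIAL_EQUAL_SIGNS.sub('= ', line.replace('*', r'\*'))
print(md_line)
```

The lookahead `(?==)` matches a zero-width position, so only the `=` characters that have a successor are rewritten; the final `=` of a run is left alone.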
55
test.py

@@ -1,55 +0,0 @@
#!/usr/bin/env python
import glob
import os.path
from parameterized import parameterized
import re
import tempfile
import unittest
import warnings

from description import parse_description, DescriptionParsingError


class TestParseDescription(unittest.TestCase):
    config = {
        'aryion': 'UserAryion',
        'furaffinity': 'UserFuraffinity',
        'inkbunny': 'UserInkbunny',
        'sofurry': 'UserSoFurry',
        'weasyl': 'UserWeasyl',
    }
    define_options = {'test_parse_description'}

    def setUp(self):
        self.tmpdir = tempfile.TemporaryDirectory(ignore_cleanup_errors=True)
        warnings.simplefilter('ignore', ResourceWarning)

    def tearDown(self):
        self.tmpdir.cleanup()
        warnings.simplefilter('default', ResourceWarning)

    @parameterized.expand([
        (re.match(r'.*(input_\d+)\.txt', v)[1], v) for v in sorted(glob.iglob('./test/description/input_*.txt'))
    ])
    def test_parse_success(self, name, test_description):
        with tempfile.TemporaryDirectory(ignore_cleanup_errors=True) as tmpdir:
            parse_description(test_description, self.config, tmpdir, define_options=self.define_options)
            for expected_output_file in glob.iglob(f'./test/description/output_{name[6:]}/*'):
                received_output_file = os.path.join(tmpdir, os.path.split(expected_output_file)[1])
                self.assertTrue(os.path.exists(received_output_file))
                self.assertTrue(os.path.isfile(received_output_file))
                with open(received_output_file, 'r') as f:
                    received_description = f.read()
                with open(expected_output_file, 'r') as f:
                    expected_description = f.read()
                self.assertEqual(received_description, expected_description)

    @parameterized.expand([
        (re.match(r'.*(error_.+)\.txt', v)[1], v) for v in sorted(glob.iglob('./test/description/error_*.txt'))
    ])
    def test_parse_errors(self, _, test_description):
        self.assertRaises(DescriptionParsingError, lambda: parse_description(test_description, self.config, self.tmpdir.name, define_options=self.define_options))
        self.assertListEqual(glob.glob(os.path.join(self.tmpdir.name, '*')), [])


if __name__ == '__main__':
    unittest.main()
@@ -1 +0,0 @@
[url=https://example.com]Nested [url=https://example.net]URLs[/url][/url]

@@ -1 +0,0 @@
ZERO[b]ONE[i]TWO[u]THREE[b]FOUR[url=https://example.com]FIVE[/url]FOUR[/b]THREE[/u]TWO[/i]ONE[/b]ZERO

@@ -1 +0,0 @@
[i]Hello world!

@@ -1 +0,0 @@
Hello world![/u]

@@ -1 +0,0 @@
[user][unknown=Foo]Bar[/unknown][/user]

@@ -1,9 +0,0 @@
[b]Hello world![/b]

This is just a [u]simple[/u] test to show that basic functionality of [url=https://github.com/BadMannersXYZ/upload-generator]upload-generator[/url] [i]works[/i]. [if=define==test_parse_description]And this is running in a unit test.[/if][else]Why did you parse this outside of a unit test?![/else]

[center]Reminder that I am [self][/self]![/center]

My friend: [user][sofurry=FriendSoFurry][fa=FriendFa][mastodon=@FriendMastodon@example.org]Friend123[/mastodon][/fa][/sofurry][/user][if=site in ib,aryion,weasyl] (I dunno his account here...)[/if]

[siteurl][eka=https://example.com/eka][inkbunny=https://example.com/ib][generic=https://example.com/generic]Check this page![/generic][/inkbunny][/eka][/siteurl]

@@ -1,12 +0,0 @@
[self][/self]

[if=site==eka] -> [/if][user][eka=EkaPerson]EkaName[/eka][/user] [user][eka]EkaPerson[/eka][/user]
[if=site==fa] -> [/if][user][fa=FaPerson]FaName[/fa][/user] [user][fa]FaPerson[/fa][/user]
[if=site==ib] -> [/if][user][ib=IbPerson]IbName[/ib][/user] [user][ib]IbPerson[/ib][/user]
[if=site==sofurry] -> [/if][user][sf=SfPerson]SfName[/sf][/user] [user][sf]SfPerson[/sf][/user]
[if=site==weasyl] -> [/if][user][weasyl=WeasylPerson]WeasylName[/weasyl][/user] [user][weasyl]WeasylPerson[/weasyl][/user]
[user][twitter=XPerson]XName[/twitter][/user] [user][twitter]XPerson[/twitter][/user]
[user][mastodon=MastodonPerson@example.com]MastodonName[/mastodon][/user] [user][mastodon]MastodonPerson@example.com[/mastodon][/user]
[user][twitter=Ignored][generic=https://example.net/GenericPerson]GenericName[/generic][/twitter][/user]

[siteurl][aryion=https://example.com/aryion][furaffinity=https://example.com/furaffinity][inkbunny=https://example.com/inkbunny][sofurry=https://example.com/sofurry][generic=https://example.com/generic]Link[/generic][/sofurry][/inkbunny][/furaffinity][/aryion][/siteurl]

@@ -1,9 +0,0 @@
[b]Hello world![/b]

This is just a [u]simple[/u] test to show that basic functionality of [url=https://github.com/BadMannersXYZ/upload-generator]upload-generator[/url] [i]works[/i]. And this is running in a unit test.

[center]Reminder that I am :iconUserAryion:![/center]

My friend: [url=https://example.org/@FriendMastodon]Friend123[/url] (I dunno his account here...)

[url=https://example.com/eka]Check this page![/url]

@@ -1,9 +0,0 @@
[b]Hello world![/b]

This is just a [u]simple[/u] test to show that basic functionality of [url=https://github.com/BadMannersXYZ/upload-generator]upload-generator[/url] [i]works[/i]. And this is running in a unit test.

[center]Reminder that I am :iconUserFuraffinity:![/center]

My friend: :iconFriendFa:

[url=https://example.com/generic]Check this page![/url]

@@ -1,9 +0,0 @@
[b]Hello world![/b]

This is just a [u]simple[/u] test to show that basic functionality of [url=https://github.com/BadMannersXYZ/upload-generator]upload-generator[/url] [i]works[/i]. And this is running in a unit test.

[center]Reminder that I am [iconname]UserInkbunny[/iconname]![/center]

My friend: [fa]FriendFa[/fa] (I dunno his account here...)

[url=https://example.com/ib]Check this page![/url]

@@ -1,9 +0,0 @@
[b]Hello world![/b]

This is just a [u]simple[/u] test to show that basic functionality of [url=https://github.com/BadMannersXYZ/upload-generator]upload-generator[/url] [i]works[/i]. And this is running in a unit test.

[center]Reminder that I am :iconUserSoFurry:![/center]

My friend: :iconFriendSoFurry:

[url=https://example.com/generic]Check this page![/url]

@@ -1,9 +0,0 @@
**Hello world!**

This is just a <u>simple</u> test to show that basic functionality of [upload-generator](https://github.com/BadMannersXYZ/upload-generator) *works*. And this is running in a unit test.

<div class="align-center">Reminder that I am <!~UserWeasyl>!</div>

My friend: <fa:FriendFa> (I dunno his account here...)

[Check this page!](https://example.com/generic)

@@ -1,12 +0,0 @@
:iconUserAryion:

-> :iconEkaPerson: :iconEkaPerson:
[url=https://furaffinity.net/user/FaPerson]FaName[/url] [url=https://furaffinity.net/user/FaPerson]FaPerson[/url]
[url=https://inkbunny.net/IbPerson]IbName[/url] [url=https://inkbunny.net/IbPerson]IbPerson[/url]
[url=https://sfperson.sofurry.com]SfName[/url] [url=https://sfperson.sofurry.com]SfPerson[/url]
[url=https://www.weasyl.com/~weasylperson]WeasylName[/url] [url=https://www.weasyl.com/~weasylperson]WeasylPerson[/url]
[url=https://twitter.com/XPerson]XName[/url] [url=https://twitter.com/XPerson]XPerson[/url]
[url=https://example.com/@MastodonPerson]MastodonName[/url] [url=https://example.com/@MastodonPerson]MastodonPerson@example.com[/url]
[url=https://example.net/GenericPerson]GenericName[/url]

[url=https://example.com/aryion]Link[/url]

@@ -1,12 +0,0 @@
:iconUserFuraffinity:

[url=https://aryion.com/g4/user/EkaPerson]EkaName[/url] [url=https://aryion.com/g4/user/EkaPerson]EkaPerson[/url]
-> :iconFaPerson: :iconFaPerson:
[url=https://inkbunny.net/IbPerson]IbName[/url] [url=https://inkbunny.net/IbPerson]IbPerson[/url]
[url=https://sfperson.sofurry.com]SfName[/url] [url=https://sfperson.sofurry.com]SfPerson[/url]
[url=https://www.weasyl.com/~weasylperson]WeasylName[/url] [url=https://www.weasyl.com/~weasylperson]WeasylPerson[/url]
[url=https://twitter.com/XPerson]XName[/url] [url=https://twitter.com/XPerson]XPerson[/url]
[url=https://example.com/@MastodonPerson]MastodonName[/url] [url=https://example.com/@MastodonPerson]MastodonPerson@example.com[/url]
[url=https://example.net/GenericPerson]GenericName[/url]

[url=https://example.com/furaffinity]Link[/url]

@@ -1,12 +0,0 @@
[iconname]UserInkbunny[/iconname]

[url=https://aryion.com/g4/user/EkaPerson]EkaName[/url] [url=https://aryion.com/g4/user/EkaPerson]EkaPerson[/url]
[fa]FaPerson[/fa] [fa]FaPerson[/fa]
-> [iconname]IbPerson[/iconname] [iconname]IbPerson[/iconname]
[sf]SfPerson[/sf] [sf]SfPerson[/sf]
[weasyl]weasylperson[/weasyl] [weasyl]weasylperson[/weasyl]
[url=https://twitter.com/XPerson]XName[/url] [url=https://twitter.com/XPerson]XPerson[/url]
[url=https://example.com/@MastodonPerson]MastodonName[/url] [url=https://example.com/@MastodonPerson]MastodonPerson@example.com[/url]
[url=https://example.net/GenericPerson]GenericName[/url]

[url=https://example.com/inkbunny]Link[/url]

@@ -1,12 +0,0 @@
:iconUserSoFurry:

[url=https://aryion.com/g4/user/EkaPerson]EkaName[/url] [url=https://aryion.com/g4/user/EkaPerson]EkaPerson[/url]
fa!FaPerson fa!FaPerson
ib!IbPerson ib!IbPerson
-> :iconSfPerson: :iconSfPerson:
[url=https://www.weasyl.com/~weasylperson]WeasylName[/url] [url=https://www.weasyl.com/~weasylperson]WeasylPerson[/url]
[url=https://twitter.com/XPerson]XName[/url] [url=https://twitter.com/XPerson]XPerson[/url]
[url=https://example.com/@MastodonPerson]MastodonName[/url] [url=https://example.com/@MastodonPerson]MastodonPerson@example.com[/url]
[url=https://example.net/GenericPerson]GenericName[/url]

[url=https://example.com/sofurry]Link[/url]

@@ -1,12 +0,0 @@
<!~UserWeasyl>

[EkaName](https://aryion.com/g4/user/EkaPerson) [EkaPerson](https://aryion.com/g4/user/EkaPerson)
<fa:FaPerson> <fa:FaPerson>
<ib:IbPerson> <ib:IbPerson>
<sf:SfPerson> <sf:SfPerson>
-> <!~WeasylPerson> <!~WeasylPerson>
[XName](https://twitter.com/XPerson) [XPerson](https://twitter.com/XPerson)
[MastodonName](https://example.com/@MastodonPerson) [MastodonPerson@example.com](https://example.com/@MastodonPerson)
[GenericName](https://example.net/GenericPerson)

[Link](https://example.com/generic)