Compare commits
18 commits
SHA1
---
6dfa73a3d3
1fef22269e
20ffbd4c6d
190e38dfc1
107504050e
36c7401fa8
a8ffce918e
097f02938d
8906d8f7f7
89d9dc3cbe
a4134fa416
4bd0839669
7dbb4a522f
fc6e1c59a5
9bef51a027
1f95a4f0c8
677fee5205
319cc2e6af
54 changed files with 5051 additions and 729 deletions
661 LICENSE (Normal file)

@@ -0,0 +1,661 @@
GNU AFFERO GENERAL PUBLIC LICENSE
Version 3, 19 November 2007

Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.

The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

Developers that use our General Public Licenses protect your rights
with two steps: (1) assert copyright on the software, and (2) offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.

A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.

The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.

An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.

The precise terms and conditions for copying, distribution and
modification follow.

TERMS AND CONDITIONS

0. Definitions.

"This License" refers to version 3 of the GNU Affero General Public License.

"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

A "covered work" means either the unmodified Program or a work based
on the Program.

To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

1. Source Code.

The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

The Corresponding Source for a work in source code form is that
same work.

2. Basic Permissions.

All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.

b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".

c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.

d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.

b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.

c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.

d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.

e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or

b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or

c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or

d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or

e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or

f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.

All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

8. Termination.

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

11. Patents.

A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

12. No Surrender of Others' Freedom.

If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

13. Remote Network Interaction; Use with the GNU General Public License.

Notwithstanding any other provision of this License, if you modify the
Program, your modified version must prominently offer all users
interacting with it remotely through a computer network (if your version
supports such interaction) an opportunity to receive the Corresponding
Source of your version by providing access to the Corresponding Source
from a network server at no charge, through some standard or customary
means of facilitating copying of software. This Corresponding Source
shall include the Corresponding Source for any work covered by version 3
of the GNU General Public License that is incorporated pursuant to the
following paragraph.

Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the work with which it is combined will remain governed by version
3 of the GNU General Public License.

14. Revised Versions of this License.

The Free Software Foundation may publish revised and/or new versions of
the GNU Affero General Public License from time to time. Such new versions
will be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU Affero General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU Affero General Public License, you may choose any version ever published
by the Free Software Foundation.

If the Program specifies that a proxy can decide which future
versions of the GNU Affero General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.

Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.

15. Disclaimer of Warranty.

THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

16. Limitation of Liability.

IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.

17. Interpretation of Sections 15 and 16.

If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.

END OF TERMS AND CONDITIONS

How to Apply These Terms to Your New Programs

If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Affero General Public License for more details.

You should have received a copy of the GNU Affero General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.

Also add information on how to contact you by electronic and paper mail.

If your software can interact with users remotely through a computer
network, you should also make sure that it provides a way for users to
get its source. For example, if your program is a web application, its
interface could display a "Source" link that leads users to an archive
of the code. There are many ways you could offer source, and different
solutions will be better for different programs; see section 13 for the
specific requirements.

You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU AGPL, see
<https://www.gnu.org/licenses/>.
14 Makefile (Normal file)

@@ -0,0 +1,14 @@
TARGETS = x86_64-unknown-linux-gnu aarch64-unknown-linux-gnu x86_64-apple-darwin aarch64-apple-darwin

.PHONY: all clean

all: $(TARGETS:%=dist/relay-%)

dist/relay-%: | dist/
	deno compile -A -r --target $* --include migrations --include public --no-lock --output $@ src/index.ts

dist/:
	mkdir -p dist

clean:
	rm -f dist/*
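
The pattern rule above produces one self-contained binary per target triple via `deno compile`. For anyone scripting the same loop without make, here is a rough Deno sketch (an illustration only, not part of this change; it assumes the same four targets and output layout as the Makefile):

```ts
// Sketch: cross-compile dist/relay-<target> for each target, mirroring the Makefile.
const targets = [
  "x86_64-unknown-linux-gnu",
  "aarch64-unknown-linux-gnu",
  "x86_64-apple-darwin",
  "aarch64-apple-darwin",
];

await Deno.mkdir("dist", { recursive: true });

for (const target of targets) {
  const { success } = await new Deno.Command("deno", {
    args: [
      "compile", "-A", "-r", "--target", target,
      "--include", "migrations", "--include", "public",
      "--no-lock", "--output", `dist/relay-${target}`, "src/index.ts",
    ],
  }).output();
  if (!success) throw new Error(`compile failed for ${target}`);
}
```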
28 Readme.md (Normal file)

@@ -0,0 +1,28 @@
# EVE Relay

> ⚠️ ALPHA STAGE DISCLAIMER: EVE is currently in early alpha development. Many
> features described here are still in development or planned for future
> releases. The platform is rapidly evolving, and you may encounter bugs,
> incomplete functionality, or significant changes between versions. We welcome
> early adopters and contributors who share our vision, but please be aware of
> the platform's developmental status.

# Requirements

- Deno v2.2.1 or higher
```bash
curl -fsSL https://deno.land/install.sh | sh
```

# Getting Started

```
# Clone the Eve-Relay repo
git clone https://git.arx-ccn.com/Arx/Eve-Relay Relay

# Navigate into the Relay directory
cd Relay

# start the dev relay
deno task dev
```
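
One thing the Readme does not mention yet: the relay exits at startup unless `ENCRYPTION_KEY` is set in the environment (see index.ts further down, which prints a freshly generated key and quits when it is missing). A minimal sketch to generate one yourself, using the same libraries the relay itself imports:

```ts
// Sketch: generate a value suitable for ENCRYPTION_KEY in the relay's .env file.
import { randomBytes } from "@noble/ciphers/webcrypto";
import { encodeBase64 } from "jsr:@std/encoding/base64";

console.log(`ENCRYPTION_KEY="${encodeBase64(randomBytes(32))}"`);
```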
44 biome.json (Normal file)

@@ -0,0 +1,44 @@
{
  "$schema": "https://biomejs.dev/schemas/1.9.4/schema.json",
  "files": {
    "include": ["**/*.ts"]
  },
  "vcs": {
    "enabled": true,
    "clientKind": "git",
    "useIgnoreFile": true
  },
  "linter": {
    "enabled": true,
    "rules": {
      "style": {
        "noNonNullAssertion": "off",
        "useNodejsImportProtocol": "warn"
      },
      "complexity": {
        "useLiteralKeys": "off"
      }
    }
  },
  "formatter": {
    "enabled": true,
    "formatWithErrors": true,
    "ignore": [],
    "attributePosition": "auto",
    "indentStyle": "space",
    "indentWidth": 2,
    "lineWidth": 80,
    "lineEnding": "lf"
  },
  "javascript": {
    "formatter": {
      "arrowParentheses": "always",
      "bracketSameLine": true,
      "bracketSpacing": true,
      "quoteStyle": "single",
      "quoteProperties": "asNeeded",
      "semicolons": "always",
      "trailingCommas": "all"
    }
  }
}
15 deno.json

@@ -1,24 +1,21 @@
{
  "tasks": {
    "dev": "deno run --allow-read --allow-write --allow-net --allow-ffi --allow-env --env-file --watch index.ts"
    "dev": "deno run --allow-read --allow-write --allow-net --allow-ffi --allow-env --env-file --watch src/index.ts",
    "lint": "biome check",
    "lint:fix": "biome check --write --unsafe"
  },
  "imports": {
    "@biomejs/biome": "npm:@biomejs/biome@^1.9.4",
    "@db/sqlite": "jsr:@db/sqlite@^0.12.0",
    "@noble/ciphers": "jsr:@noble/ciphers@^1.2.1",
    "@noble/hashes": "jsr:@noble/hashes@^1.8.0",
    "@nostr/tools": "jsr:@nostr/tools@^2.10.4",
    "@nostrify/nostrify": "jsr:@nostrify/nostrify@^0.37.0",
    "@nostrify/types": "jsr:@nostrify/types@^0.36.0",
    "@scure/base": "jsr:@scure/base@^1.2.4",
    "@std/encoding": "jsr:@std/encoding@^1.0.6",
    "@std/fmt": "jsr:@std/fmt@^1.0.4",
    "@std/log": "jsr:@std/log@^0.224.13",
    "@types/deno": "npm:@types/deno@^2.0.0"
  },
  "fmt": {
    "indentWidth": 2,
    "useTabs": false,
    "lineWidth": 80,
    "proseWrap": "always",
    "semiColons": true,
    "singleQuote": false
  }
}
40 deno.lock (generated)

@@ -30,6 +30,8 @@
    "jsr:@std/path@0.217": "0.217.0",
    "jsr:@std/path@0.221": "0.221.0",
    "jsr:@std/path@^1.0.8": "1.0.8",
    "npm:@biomejs/biome@1.9.4": "1.9.4",
    "npm:@biomejs/biome@^1.9.4": "1.9.4",
    "npm:@noble/ciphers@~0.5.1": "0.5.3",
    "npm:@noble/curves@1.2.0": "1.2.0",
    "npm:@noble/hashes@1.3.1": "1.3.1",

@@ -168,6 +170,43 @@
      }
    },
    "npm": {
      "@biomejs/biome@1.9.4": {
        "integrity": "sha512-1rkd7G70+o9KkTn5KLmDYXihGoTaIGO9PIIN2ZB7UJxFrWw04CZHPYiMRjYsaDvVV7hP1dYNRLxSANLaBFGpog==",
        "dependencies": [
          "@biomejs/cli-darwin-arm64",
          "@biomejs/cli-darwin-x64",
          "@biomejs/cli-linux-arm64",
          "@biomejs/cli-linux-arm64-musl",
          "@biomejs/cli-linux-x64",
          "@biomejs/cli-linux-x64-musl",
          "@biomejs/cli-win32-arm64",
          "@biomejs/cli-win32-x64"
        ]
      },
      "@biomejs/cli-darwin-arm64@1.9.4": {
        "integrity": "sha512-bFBsPWrNvkdKrNCYeAp+xo2HecOGPAy9WyNyB/jKnnedgzl4W4Hb9ZMzYNbf8dMCGmUdSavlYHiR01QaYR58cw=="
      },
      "@biomejs/cli-darwin-x64@1.9.4": {
        "integrity": "sha512-ngYBh/+bEedqkSevPVhLP4QfVPCpb+4BBe2p7Xs32dBgs7rh9nY2AIYUL6BgLw1JVXV8GlpKmb/hNiuIxfPfZg=="
      },
      "@biomejs/cli-linux-arm64-musl@1.9.4": {
        "integrity": "sha512-v665Ct9WCRjGa8+kTr0CzApU0+XXtRgwmzIf1SeKSGAv+2scAlW6JR5PMFo6FzqqZ64Po79cKODKf3/AAmECqA=="
      },
      "@biomejs/cli-linux-arm64@1.9.4": {
        "integrity": "sha512-fJIW0+LYujdjUgJJuwesP4EjIBl/N/TcOX3IvIHJQNsAqvV2CHIogsmA94BPG6jZATS4Hi+xv4SkBBQSt1N4/g=="
      },
      "@biomejs/cli-linux-x64-musl@1.9.4": {
        "integrity": "sha512-gEhi/jSBhZ2m6wjV530Yy8+fNqG8PAinM3oV7CyO+6c3CEh16Eizm21uHVsyVBEB6RIM8JHIl6AGYCv6Q6Q9Tg=="
      },
      "@biomejs/cli-linux-x64@1.9.4": {
        "integrity": "sha512-lRCJv/Vi3Vlwmbd6K+oQ0KhLHMAysN8lXoCI7XeHlxaajk06u7G+UsFSO01NAs5iYuWKmVZjmiOzJ0OJmGsMwg=="
      },
      "@biomejs/cli-win32-arm64@1.9.4": {
        "integrity": "sha512-tlbhLk+WXZmgwoIKwHIHEBZUwxml7bRJgk0X2sPyNR3S93cdRq6XulAZRQJ17FYGGzWne0fgrXBKpl7l4M87Hg=="
      },
      "@biomejs/cli-win32-x64@1.9.4": {
        "integrity": "sha512-8Y5wMhVIPaWe6jw2H+KlEm4wP/f7EW3810ZLmDlrEEy5KvBsb9ECEfu/kMWD484ijfQ8+nIi0giMgu9g1UAuuA=="
      },
      "@noble/ciphers@0.5.3": {
        "integrity": "sha512-B0+6IIHiqEs3BPMT0hcRmHvEj2QHOLu+uwt+tqDDeVd0oyVzh7BPrDcPjRnV1PV/5LaknXJJQvOuRGR0zQJz+w=="
      },

@@ -272,6 +311,7 @@
      "jsr:@std/encoding@^1.0.6",
      "jsr:@std/fmt@^1.0.4",
      "jsr:@std/log@~0.224.13",
      "npm:@biomejs/biome@^1.9.4",
      "npm:@types/deno@2"
    ]
  }
581 index.ts

@@ -1,581 +0,0 @@
import { NSchema as n } from "jsr:@nostrify/nostrify";
import type {
  NostrClientREQ,
  NostrEvent,
  NostrFilter,
} from "jsr:@nostrify/types";
import {
  getCCNPrivateKey,
  getCCNPubkey,
  isArray,
  isLocalhost,
  isValidJSON,
  randomTimeUpTo2DaysInThePast,
} from "./utils.ts";
import * as nostrTools from "@nostr/tools";
import { nip44 } from "@nostr/tools";
import { randomBytes } from "@noble/ciphers/webcrypto";
import { encodeBase64 } from "jsr:@std/encoding@0.224/base64";
import { Database } from "jsr:@db/sqlite";
import { mixQuery, sql, sqlPartial } from "./utils/queries.ts";
import { log, setupLogger } from "./utils/logs.ts";
import { getEveFilePath } from "./utils/files.ts";

await setupLogger();

if (!Deno.env.has("ENCRYPTION_KEY")) {
  log.error(
    `Missing ENCRYPTION_KEY. Please set it in your env.\nA new one has been generated for you: ENCRYPTION_KEY="${
      encodeBase64(
        randomBytes(32),
      )
    }"`,
  );
  Deno.exit(1);
}

const db = new Database(await getEveFilePath("db"));
const pool = new nostrTools.SimplePool();
const relays = [
  "wss://relay.arx-ccn.com/",
  "wss://relay.dannymorabito.com/",
  "wss://nos.lol/",
  "wss://nostr.einundzwanzig.space/",
  "wss://nostr.massmux.com/",
  "wss://nostr.mom/",
  "wss://nostr.wine/",
  "wss://purplerelay.com/",
  "wss://relay.damus.io/",
  "wss://relay.goodmorningbitcoin.com/",
  "wss://relay.lexingtonbitcoin.org/",
  "wss://relay.nostr.band/",
  "wss://relay.primal.net/",
  "wss://relay.snort.social/",
  "wss://strfry.iris.to/",
  "wss://cache2.primal.net/v1",
];

export function runMigrations(db: Database, latestVersion: number) {
  const migrations = Deno.readDirSync(`${import.meta.dirname}/migrations`);
  for (const migrationFile of migrations) {
    const migrationVersion = Number.parseInt(
      migrationFile.name.split("-")[0],
      10,
    );

    if (migrationVersion > latestVersion) {
      log.info(
        `Running migration ${migrationFile.name} (version ${migrationVersion})`,
      );
      const start = Date.now();
      const migrationSql = Deno.readTextFileSync(
        `${import.meta.dirname}/migrations/${migrationFile.name}`,
      );
      db.run("BEGIN TRANSACTION");
      try {
        db.run(migrationSql);
        const end = Date.now();
        const durationMs = end - start;
        sql`
          INSERT INTO migration_history (migration_version, migration_name, executed_at, duration_ms, status) VALUES (${migrationVersion}, ${migrationFile.name}, ${
          new Date().toISOString()
        }, ${durationMs}, 'success');
        `(db);
        db.run("COMMIT TRANSACTION");
      } catch (e) {
        db.run("ROLLBACK TRANSACTION");
        const error = e instanceof Error
          ? e
          : typeof e === "string"
          ? new Error(e)
          : new Error(JSON.stringify(e));
        const end = Date.now();
        const durationMs = end - start;
        sql`
          INSERT INTO migration_history (migration_version, migration_name, executed_at, duration_ms, status, error_message) VALUES (${migrationVersion}, ${migrationFile.name}, ${
          new Date().toISOString()
        }, ${durationMs}, 'failed', ${error.message});
        `(db);
        throw e;
      }
    }
  }
}
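
runMigrations derives each migration's version from the numeric prefix of its filename and applies everything newer than the last successful version recorded in migration_history, each file inside its own transaction. A small sketch of the naming convention it assumes (the filenames below are hypothetical examples, not files from this repository):

```ts
// Assumed convention: migrations/<version>-<description>.sql
function migrationVersion(fileName: string): number {
  return Number.parseInt(fileName.split("-")[0], 10);
}

console.assert(migrationVersion("0-initial-schema.sql") === 0);
console.assert(migrationVersion("3-add-event-tags.sql") === 3);
```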

async function createEncryptedEvent(
  event: nostrTools.VerifiedEvent,
): Promise<nostrTools.VerifiedEvent> {
  if (!event.id) throw new Error("Event must have an ID");
  if (!event.sig) throw new Error("Event must be signed");
  const ccnPubKey = await getCCNPubkey();
  const ccnPrivateKey = await getCCNPrivateKey();
  const randomPrivateKey = nostrTools.generateSecretKey();
  const conversationKey = nip44.getConversationKey(randomPrivateKey, ccnPubKey);
  const sealTemplate = {
    kind: 13,
    created_at: randomTimeUpTo2DaysInThePast(),
    content: nip44.encrypt(JSON.stringify(event), conversationKey),
    tags: [],
  };
  const seal = nostrTools.finalizeEvent(sealTemplate, ccnPrivateKey);
  const giftWrapTemplate = {
    kind: 1059,
    created_at: randomTimeUpTo2DaysInThePast(),
    content: nip44.encrypt(JSON.stringify(seal), conversationKey),
    tags: [["p", ccnPubKey]],
  };
  const giftWrap = nostrTools.finalizeEvent(giftWrapTemplate, randomPrivateKey);
  return giftWrap;
}

async function decryptEvent(
  event: nostrTools.Event,
): Promise<nostrTools.VerifiedEvent> {
  const ccnPrivkey = await getCCNPrivateKey();

  if (event.kind !== 1059) {
    throw new Error("Cannot decrypt event -- not a gift wrap");
  }

  const conversationKey = nip44.getConversationKey(ccnPrivkey, event.pubkey);
  const seal = JSON.parse(nip44.decrypt(event.content, conversationKey));
  if (!seal) throw new Error("Cannot decrypt event -- no seal");
  if (seal.kind !== 13) {
    throw new Error("Cannot decrypt event subevent -- not a seal");
  }
  const content = JSON.parse(nip44.decrypt(seal.content, conversationKey));
  return content as nostrTools.VerifiedEvent;
}
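
createEncryptedEvent and decryptEvent implement a NIP-59-style double wrap: the signed event is NIP-44-encrypted into a kind 13 seal, the seal is encrypted again into a kind 1059 gift wrap signed by a throwaway key, and both timestamps are randomized up to two days into the past to resist timing correlation. A self-contained round-trip sketch of the same layering (illustrative values only; it mirrors the calls above rather than the relay's exact code path):

```ts
import * as nostrTools from "@nostr/tools";
import { nip44 } from "@nostr/tools";

const ccnSecret = nostrTools.generateSecretKey(); // recipient (CCN) key
const ccnPub = nostrTools.getPublicKey(ccnSecret);
const ephemeral = nostrTools.generateSecretKey(); // throwaway wrap key

// One conversation key covers both layers, as in createEncryptedEvent.
const convKey = nip44.getConversationKey(ephemeral, ccnPub);
const rumor = { kind: 1, content: "hello", created_at: 0, tags: [] };

const seal = nostrTools.finalizeEvent(
  {
    kind: 13,
    created_at: 0,
    content: nip44.encrypt(JSON.stringify(rumor), convKey),
    tags: [],
  },
  ccnSecret,
);
const wrap = nostrTools.finalizeEvent(
  {
    kind: 1059,
    created_at: 0,
    content: nip44.encrypt(JSON.stringify(seal), convKey),
    tags: [["p", ccnPub]],
  },
  ephemeral,
);

// Unwrap: ECDH is symmetric, so (ccnSecret, wrap.pubkey) yields the same key.
const sameKey = nip44.getConversationKey(ccnSecret, wrap.pubkey);
const openedSeal = JSON.parse(nip44.decrypt(wrap.content, sameKey));
const openedRumor = JSON.parse(nip44.decrypt(openedSeal.content, sameKey));
console.assert(openedRumor.content === "hello");
```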

class EventAlreadyExistsException extends Error {}

function addEventToDb(
  decryptedEvent: nostrTools.VerifiedEvent,
  encryptedEvent: nostrTools.VerifiedEvent,
) {
  const existingEvent = sql`
    SELECT * FROM events WHERE id = ${decryptedEvent.id}
  `(db)[0];

  if (existingEvent) throw new EventAlreadyExistsException();
  try {
    db.run("BEGIN TRANSACTION");
    sql`
      INSERT INTO events (id, original_id, pubkey, created_at, kind, content, sig, first_seen) VALUES (
        ${decryptedEvent.id},
        ${encryptedEvent.id},
        ${decryptedEvent.pubkey},
        ${decryptedEvent.created_at},
        ${decryptedEvent.kind},
        ${decryptedEvent.content},
        ${decryptedEvent.sig},
        unixepoch()
      )
    `(db);
    if (decryptedEvent.tags) {
      for (let i = 0; i < decryptedEvent.tags.length; i++) {
        const tag = sql`
          INSERT INTO event_tags(event_id, tag_name, tag_index) VALUES (
            ${decryptedEvent.id},
            ${decryptedEvent.tags[i][0]},
            ${i}
          ) RETURNING tag_id
        `(db)[0];
        for (let j = 1; j < decryptedEvent.tags[i].length; j++) {
          sql`
            INSERT INTO event_tags_values(tag_id, value_position, value) VALUES (
              ${tag.tag_id},
              ${j},
              ${decryptedEvent.tags[i][j]}
            )
          `(db);
        }
      }
    }
    db.run("COMMIT TRANSACTION");
  } catch (e) {
    db.run("ROLLBACK TRANSACTION");
    throw e;
  }
}
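
addEventToDb normalizes tags across two tables: event_tags holds each tag's name and its index within the event, and event_tags_values holds the remaining elements keyed by position. A worked illustration of the mapping (the tag_id of 7 is made up; SQLite assigns the real one):

```ts
// One nostr tag on an event...
const tag = ["e", "abc123", "wss://relay.example/"];
// ...is stored as:
// event_tags:        (event_id, tag_name: "e", tag_index: 0)  -> tag_id 7
// event_tags_values: (tag_id: 7, value_position: 1, value: "abc123")
// event_tags_values: (tag_id: 7, value_position: 2, value: "wss://relay.example/")
```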

function encryptedEventIsInDb(event: nostrTools.VerifiedEvent) {
  return sql`
    SELECT * FROM events WHERE original_id = ${event.id}
  `(db)[0];
}

async function setupAndSubscribeToExternalEvents() {
  const ccnPubkey = await getCCNPubkey();

  const isInitialized = sql`
    SELECT name FROM sqlite_master WHERE type='table' AND name='migration_history'
  `(db)[0];

  if (!isInitialized) runMigrations(db, -1);

  const latestVersion = sql`
    SELECT migration_version FROM migration_history WHERE status = 'success' ORDER BY migration_version DESC LIMIT 1
  `(db)[0]?.migration_version ?? -1;

  runMigrations(db, latestVersion);

  pool.subscribeMany(
    relays,
    [
      {
        "#p": [ccnPubkey],
        kinds: [1059],
      },
    ],
    {
      async onevent(event: nostrTools.Event) {
        if (timer) {
          timerCleaned = true;
          clearTimeout(timer);
        }
        if (knownOriginalEvents.indexOf(event.id) >= 0) return;
        if (!nostrTools.verifyEvent(event)) {
          log.warn("Invalid event received");
          return;
        }
        if (encryptedEventIsInDb(event)) return;
        const decryptedEvent = await decryptEvent(event);
        try {
          addEventToDb(decryptedEvent, event);
        } catch (e) {
          if (e instanceof EventAlreadyExistsException) return;
        }
      },
    },
  );

  let timerCleaned = false;

  const knownOriginalEvents = sql`SELECT original_id FROM events`(db).flatMap(
    (row) => row.original_id,
  );

  const timer = setTimeout(async () => {
    // if nothing is found in 10 seconds, create a new CCN, TODO: change logic
    const ccnCreationEventTemplate = {
      kind: 0,
      content: JSON.stringify({
        display_name: "New CCN",
        name: "New CCN",
        bot: true,
      }),
      created_at: Math.floor(Date.now() / 1000),
      tags: [["p", ccnPubkey]],
    };
    const ccnCreationEvent = nostrTools.finalizeEvent(
      ccnCreationEventTemplate,
      await getCCNPrivateKey(),
    );
    const encryptedCCNCreationEvent = await createEncryptedEvent(
      ccnCreationEvent,
    );
    if (timerCleaned) return; // in case we get an event before the timer is cleaned
    await Promise.any(pool.publish(relays, encryptedCCNCreationEvent));
  }, 10000);
}

await setupAndSubscribeToExternalEvents();

class UserConnection {
  public socket: WebSocket;
  public subscriptions: Map<string, NostrFilter[]>;
  public db: Database;

  constructor(
    socket: WebSocket,
    subscriptions: Map<string, NostrFilter[]>,
    db: Database,
  ) {
    this.socket = socket;
    this.subscriptions = subscriptions;
    this.db = db;
  }
}

function filtersMatchingEvent(
  event: NostrEvent,
  connection: UserConnection,
): string[] {
  const matching = [];
  for (const subscription of connection.subscriptions.keys()) {
    const filters = connection.subscriptions.get(subscription);
    if (!filters) continue;
    const isMatching = filters.every((filter) =>
      Object.entries(filter).every(([type, value]) => {
        if (type === "ids") return value.includes(event.id);
        if (type === "kinds") return value.includes(event.kind);
        if (type === "authors") return value.includes(event.pubkey);
        if (type === "since") return event.created_at >= value;
        if (type === "until") return event.created_at <= value;
        if (type === "limit") return event.created_at <= value;
        if (type.startsWith("#")) {
          const tagName = type.slice(1);
          return event.tags.some(
            (tag: string[]) => tag[0] === tagName && value.includes(tag[1]),
          );
        }
        return false;
      })
    );
    if (isMatching) matching.push(subscription);
  }
  return matching;
}
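
filtersMatchingEvent returns the subscription IDs that should receive an event; note that `every` is applied on both levels, so all filters of a subscription must match, not just one. A hypothetical call (`socket` and `incoming` are placeholders, and the EVENT message shape is the standard NIP-01 relay-to-client form):

```ts
declare const socket: WebSocket;
declare const incoming: NostrEvent;

const conn = new UserConnection(socket, new Map(), db);
conn.subscriptions.set("sub1", [{ kinds: [1059], "#p": ["<ccn pubkey>"] }]);

for (const subId of filtersMatchingEvent(incoming, conn)) {
  conn.socket.send(JSON.stringify(["EVENT", subId, incoming]));
}
```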
|
||||
|
||||
function handleRequest(connection: UserConnection, request: NostrClientREQ) {
|
||||
const [, subscriptionId, ...filters] = request;
|
||||
if (connection.subscriptions.has(subscriptionId)) {
|
||||
return log.warn("Duplicate subscription ID");
|
||||
}
|
||||
|
||||
log.info(
|
||||
`New subscription: ${subscriptionId} with filters: ${
|
||||
JSON.stringify(
|
||||
filters,
|
||||
)
|
||||
}`,
|
||||
);
|
||||
|
||||
let query = sqlPartial`SELECT * FROM events`;
|
||||
|
||||
const filtersAreNotEmpty = filters.some((filter) => {
|
||||
return Object.values(filter).some((value) => {
|
||||
return value.length > 0;
|
||||
});
|
||||
});
|
||||
|
||||
if (filtersAreNotEmpty) {
|
||||
query = mixQuery(query, sqlPartial`WHERE`);
|
||||
|
||||
for (let i = 0; i < filters.length; i++) {
|
||||
// filters act as OR, filter groups act as AND
|
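      // For example (an illustrative sketch, not from this changeset): a REQ with
      // filters [{ kinds: [1], authors: ["abc"] }, { ids: ["def"] }] builds roughly
      //   SELECT * FROM events WHERE (kind IN (?) AND pubkey IN (?)) OR (id IN (?))
      // with 1, "abc", and "def" as bound values; sqlPartial/mixQuery accumulate
      // the query text and its parameter list together.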
      query = mixQuery(query, sqlPartial`(`);

      const filter = Object.entries(filters[i]).filter(([type, value]) => {
        if (type === "ids") return value.length > 0;
        if (type === "authors") return value.length > 0;
        if (type === "kinds") return value.length > 0;
        if (type.startsWith("#")) return value.length > 0;
        if (type === "since") return value > 0;
        if (type === "until") return value > 0;
        return false;
      });

      for (let j = 0; j < filter.length; j++) {
        const [type, value] = filter[j];

        if (type === "ids") {
          const uniqueIds = [...new Set(value)];
          query = mixQuery(query, sqlPartial`id IN (`);
          for (let k = 0; k < uniqueIds.length; k++) {
            const id = uniqueIds[k] as string;

            query = mixQuery(query, sqlPartial`${id}`);

            if (k < uniqueIds.length - 1) {
              query = mixQuery(query, sqlPartial`,`);
            }
          }
          query = mixQuery(query, sqlPartial`)`);
        }

        if (type === "authors") {
          const uniqueAuthors = [...new Set(value)];
          query = mixQuery(query, sqlPartial`pubkey IN (`);
          for (let k = 0; k < uniqueAuthors.length; k++) {
            const author = uniqueAuthors[k] as string;

            query = mixQuery(query, sqlPartial`${author}`);

            if (k < uniqueAuthors.length - 1) {
              query = mixQuery(query, sqlPartial`,`);
            }
          }
          query = mixQuery(query, sqlPartial`)`);
        }

        if (type === "kinds") {
          const uniqueKinds = [...new Set(value)];
          query = mixQuery(query, sqlPartial`kind IN (`);
          for (let k = 0; k < uniqueKinds.length; k++) {
            const kind = uniqueKinds[k] as number;

            query = mixQuery(query, sqlPartial`${kind}`);

            if (k < uniqueKinds.length - 1) {
              query = mixQuery(query, sqlPartial`,`);
            }
          }
          query = mixQuery(query, sqlPartial`)`);
        }

        if (type.startsWith("#")) {
          const tag = type.slice(1);
          const uniqueValues = [...new Set(value)];
          query = mixQuery(query, sqlPartial`(`);
          for (let k = 0; k < uniqueValues.length; k++) {
            const value = uniqueValues[k] as string;

            query = mixQuery(
              query,
              sqlPartial`id IN (
                SELECT t.event_id
                FROM event_tags t
                WHERE t.tag_name = ${tag}
                AND t.tag_id IN (
                  SELECT v.tag_id
                  FROM event_tags_values v
                  WHERE v.value_position = 1
                  AND v.value = ${value}
                )
              )`,
            );
            if (k < uniqueValues.length - 1) {
              query = mixQuery(query, sqlPartial`OR`);
            }
          }
          query = mixQuery(query, sqlPartial`)`);
        }

        if (type === "since") {
          query = mixQuery(query, sqlPartial`created_at >= ${value}`);
        }

        if (type === "until") {
          query = mixQuery(query, sqlPartial`created_at <= ${value}`);
        }

        if (j < filter.length - 1) query = mixQuery(query, sqlPartial`AND`);
      }

      query = mixQuery(query, sqlPartial`)`);

      if (i < filters.length - 1) query = mixQuery(query, sqlPartial`OR`);
    }
  }

  query = mixQuery(query, sqlPartial`ORDER BY created_at ASC`);

  log.debug(query.query, ...query.values);

  const events = connection.db.prepare(query.query).all(...query.values);

  for (let i = 0; i < events.length; i++) {
    const rawTags = sql`SELECT * FROM event_tags_view WHERE event_id = ${events[i].id}`(
      connection.db,
    );
    const tags: { [key: string]: string[] } = {};
    for (const item of rawTags) {
      if (!tags[item.tag_name]) tags[item.tag_name] = [item.tag_name];
      tags[item.tag_name].push(item.tag_value);
    }
    const tagsArray = Object.values(tags);

    const event = {
      id: events[i].id,
      pubkey: events[i].pubkey,
      created_at: events[i].created_at,
      kind: events[i].kind,
      tags: tagsArray,
      content: events[i].content,
      sig: events[i].sig,
    };

    connection.socket.send(JSON.stringify(["EVENT", subscriptionId, event]));
  }
  connection.socket.send(JSON.stringify(["EOSE", subscriptionId]));

  connection.subscriptions.set(subscriptionId, filters);
}

async function handleEvent(
  connection: UserConnection,
  event: nostrTools.Event,
) {
  const valid = nostrTools.verifyEvent(event);
  if (!valid) {
    connection.socket.send(JSON.stringify(["NOTICE", "Invalid event"]));
    return log.warn("Invalid event");
  }

  const encryptedEvent = await createEncryptedEvent(event);
  try {
    addEventToDb(event, encryptedEvent);
  } catch (e) {
    if (e instanceof EventAlreadyExistsException) {
      log.warn("Event already exists");
      return;
    }
  }
  await Promise.any(pool.publish(relays, encryptedEvent));

  connection.socket.send(JSON.stringify(["OK", event.id, true, "Event added"]));

  const filtersThatMatchEvent = filtersMatchingEvent(event, connection);

  for (let i = 0; i < filtersThatMatchEvent.length; i++) {
    const filter = filtersThatMatchEvent[i];
    connection.socket.send(JSON.stringify(["EVENT", filter, event]));
  }
}

function handleClose(connection: UserConnection, subscriptionId: string) {
  if (!connection.subscriptions.has(subscriptionId)) {
    return log.warn(
      `Closing unknown subscription? That's weird. Subscription ID: ${subscriptionId}`,
    );
  }

  connection.subscriptions.delete(subscriptionId);
}

Deno.serve({
  port: 6942,
  handler: (request) => {
    if (request.headers.get("upgrade") === "websocket") {
      if (!isLocalhost(request)) {
        return new Response(
          "Forbidden. Please read the Arx-CCN documentation for more information on how to interact with the relay.",
          { status: 403 },
        );
      }

      const { socket, response } = Deno.upgradeWebSocket(request);

      const connection = new UserConnection(socket, new Map(), db);

      socket.onopen = () => log.info("User connected");
      socket.onmessage = (event) => {
        log.debug(`Received: ${event.data}`);
        if (typeof event.data !== "string" || !isValidJSON(event.data)) {
          return log.warn("Invalid request");
        }
        const data = JSON.parse(event.data);
        if (!isArray(data)) return log.warn("Invalid request");

        const msg = n.clientMsg().parse(data);
        switch (msg[0]) {
          case "REQ":
            return handleRequest(connection, n.clientREQ().parse(data));
          case "EVENT":
            return handleEvent(connection, n.clientEVENT().parse(data)[1]);
          case "CLOSE":
            return handleClose(connection, n.clientCLOSE().parse(data)[1]);
          default:
            return log.warn("Invalid request");
        }
      };
      socket.onclose = () => log.info("User disconnected");

      return response;
    }
    return new Response("Eve Relay");
  },
});
4  migrations/3-replaceableAndDeleteableEvents.sql  Normal file
@@ -0,0 +1,4 @@
ALTER TABLE events
ADD COLUMN replaced INTEGER NOT NULL DEFAULT 0;
ALTER TABLE events
ADD COLUMN deleted INTEGER NOT NULL DEFAULT 0;
13  migrations/4-createChunksStore.sql  Normal file
@@ -0,0 +1,13 @@
CREATE TABLE event_chunks (
  chunk_id INTEGER PRIMARY KEY AUTOINCREMENT,
  message_id TEXT NOT NULL,
  chunk_index INTEGER NOT NULL,
  total_chunks INTEGER NOT NULL,
  chunk_data TEXT NOT NULL,
  conversation_key TEXT NOT NULL,
  created_at INTEGER NOT NULL,
  UNIQUE(message_id, chunk_index)
);

CREATE INDEX idx_event_chunks_message_id ON event_chunks(message_id);
CREATE INDEX idx_event_chunks_created_at ON event_chunks(created_at);
21  migrations/5-multiCCN.sql  Normal file
@@ -0,0 +1,21 @@
CREATE TABLE ccns (
  ccn_id TEXT PRIMARY KEY DEFAULT (lower(hex(randomblob(16)))),
  pubkey TEXT NOT NULL UNIQUE,
  name TEXT NOT NULL,
  created_at INTEGER NOT NULL DEFAULT (unixepoch()),
  is_active INTEGER NOT NULL DEFAULT 1
);

ALTER TABLE events
ADD COLUMN ccn_pubkey TEXT;

CREATE INDEX idx_events_ccn_pubkey ON events(ccn_pubkey);

ALTER TABLE event_chunks RENAME COLUMN chunk_data TO content;
ALTER TABLE event_chunks ADD COLUMN ccn_pubkey TEXT;
ALTER TABLE event_chunks DROP COLUMN conversation_key;
CREATE INDEX idx_event_chunks_ccn_pubkey ON event_chunks(ccn_pubkey);

UPDATE ccns SET is_active = 0;
UPDATE ccns SET is_active = 1
WHERE pubkey = (SELECT pubkey FROM ccns LIMIT 1);
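-- (Together, the two UPDATEs above leave exactly one CCN marked active after the migration.)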
24  migrations/6-invitations.sql  Normal file
@@ -0,0 +1,24 @@
CREATE TABLE inviter_invitee(
  id TEXT PRIMARY KEY NOT NULL DEFAULT (lower(hex(randomblob(16)))),
  ccn_pubkey TEXT NOT NULL,
  inviter_pubkey TEXT NOT NULL,
  invitee_pubkey TEXT NOT NULL,
  invite_hash TEXT NOT NULL,
  created_at INTEGER NOT NULL DEFAULT (unixepoch()),
  FOREIGN KEY (ccn_pubkey) REFERENCES ccns(pubkey) ON DELETE CASCADE
);

CREATE INDEX idx_inviter_invitee_ccn_pubkey ON inviter_invitee(ccn_pubkey);
CREATE INDEX idx_inviter_invitee_inviter_pubkey ON inviter_invitee(inviter_pubkey);
CREATE INDEX idx_inviter_invitee_invitee_pubkey ON inviter_invitee(invitee_pubkey);

CREATE TABLE allowed_writes (
  id TEXT PRIMARY KEY NOT NULL DEFAULT (lower(hex(randomblob(16)))),
  ccn_pubkey TEXT NOT NULL,
  pubkey TEXT NOT NULL,
  created_at INTEGER NOT NULL DEFAULT (unixepoch()),
  FOREIGN KEY (ccn_pubkey) REFERENCES ccns(pubkey) ON DELETE CASCADE
);

CREATE INDEX idx_allowed_writes_ccn_pubkey ON allowed_writes(ccn_pubkey);
CREATE INDEX idx_allowed_writes_pubkey ON allowed_writes(pubkey);
32  migrations/7-createLogsTable.sql  Normal file
@@ -0,0 +1,32 @@
CREATE TABLE logs (
  log_id TEXT PRIMARY KEY DEFAULT (lower(hex(randomblob(16)))),
  timestamp TEXT NOT NULL,
  level TEXT NOT NULL CHECK (level IN ('DEBUG', 'INFO', 'WARN', 'ERROR', 'FATAL')),
  message TEXT NOT NULL,
  args TEXT, -- JSON string of log arguments
  source TEXT, -- tag or source component
  created_at INTEGER NOT NULL DEFAULT (unixepoch()),
  -- Security-specific fields
  event_type TEXT, -- for security events
  severity TEXT, -- for security events
  remote_addr TEXT,
  ccn_pubkey TEXT,
  event_id TEXT,
  risk_score REAL
);

CREATE INDEX idx_logs_timestamp ON logs(timestamp);
CREATE INDEX idx_logs_level ON logs(level);
CREATE INDEX idx_logs_created_at ON logs(created_at);
CREATE INDEX idx_logs_source ON logs(source);
CREATE INDEX idx_logs_event_type ON logs(event_type);
CREATE INDEX idx_logs_severity ON logs(severity);
CREATE INDEX idx_logs_ccn_pubkey ON logs(ccn_pubkey);

CREATE TRIGGER cleanup_old_logs
AFTER INSERT ON logs
WHEN (SELECT COUNT(*) FROM logs) > 100000
BEGIN
  DELETE FROM logs
  WHERE created_at < (unixepoch() - 2592000); -- 30 days
END;
44  migrations/8-fixChunksTableSchema.sql  Normal file
@@ -0,0 +1,44 @@
-- Fix the event_chunks table schema: add proper security constraints for chunked message handling.

-- Drop the old table if it exists
DROP TABLE IF EXISTS event_chunks;

-- Create the event_chunks table with the correct schema and security constraints
CREATE TABLE event_chunks (
  chunk_id INTEGER PRIMARY KEY AUTOINCREMENT,
  message_id TEXT NOT NULL,
  chunk_index INTEGER NOT NULL,
  total_chunks INTEGER NOT NULL CHECK (total_chunks > 0 AND total_chunks <= 1000),
  content TEXT NOT NULL,
  created_at INTEGER NOT NULL,
  ccn_pubkey TEXT NOT NULL,

  -- SECURITY: Prevent duplicate chunks and enforce data integrity
  UNIQUE(message_id, chunk_index, ccn_pubkey),

  -- SECURITY: Ensure chunk_index is within valid bounds
  CHECK (chunk_index >= 0 AND chunk_index < total_chunks),

  -- SECURITY: Limit message_id length to prevent DoS
  CHECK (length(message_id) <= 100),

  -- SECURITY: Limit content size to prevent memory exhaustion
  CHECK (length(content) <= 65536),

  -- SECURITY: Foreign key reference to ensure the CCN exists
  FOREIGN KEY (ccn_pubkey) REFERENCES ccns(pubkey) ON DELETE CASCADE
);

-- Indexes for performance
CREATE INDEX idx_event_chunks_message_id ON event_chunks(message_id);
CREATE INDEX idx_event_chunks_created_at ON event_chunks(created_at);
CREATE INDEX idx_event_chunks_ccn_pubkey ON event_chunks(ccn_pubkey);

-- SECURITY: Automatic cleanup trigger for old chunks to prevent storage exhaustion
CREATE TRIGGER cleanup_old_chunks
AFTER INSERT ON event_chunks
WHEN (SELECT COUNT(*) FROM event_chunks WHERE created_at < (unixepoch() - 86400)) > 0
BEGIN
  DELETE FROM event_chunks
  WHERE created_at < (unixepoch() - 86400);
END;
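Once every chunk for a message has arrived, reassembly is a matter of ordering by chunk_index. A sketch of the consumer side (not part of this changeset; it reuses the `sql` tagged-template helper that appears elsewhere in this diff and assumes a `db` handle in scope):

// Sketch only: returns the reassembled payload, or null while chunks are missing.
function tryReassembleMessage(
  db: Database,
  messageId: string,
  ccnPubkey: string,
): string | null {
  const chunks = sql`
    SELECT chunk_index, total_chunks, content
    FROM event_chunks
    WHERE message_id = ${messageId} AND ccn_pubkey = ${ccnPubkey}
    ORDER BY chunk_index ASC
  `(db);
  if (chunks.length === 0 || chunks.length < chunks[0].total_chunks) return null;
  // UNIQUE(message_id, chunk_index, ccn_pubkey) guarantees there are no duplicates.
  return chunks.map((chunk) => chunk.content).join('');
}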
41  migrations/9-createOutboundEventQueue.sql  Normal file
@@ -0,0 +1,41 @@
-- Create an outbound event queue for offline event creation and reliable relay transmission.
-- This allows users to create events when offline and sync them when connectivity is restored.

CREATE TABLE outbound_event_queue (
  queue_id INTEGER PRIMARY KEY AUTOINCREMENT,
  event_id TEXT NOT NULL,
  encrypted_event TEXT NOT NULL,
  ccn_pubkey TEXT NOT NULL,
  created_at INTEGER NOT NULL DEFAULT (unixepoch()),
  attempts INTEGER NOT NULL DEFAULT 0,
  last_attempt INTEGER NULL,
  status TEXT NOT NULL DEFAULT 'pending' CHECK (status IN ('pending', 'sending', 'sent', 'failed')),
  error_message TEXT NULL,

  -- Ensure one queue entry per event
  UNIQUE(event_id),

  -- Foreign key constraints
  FOREIGN KEY (ccn_pubkey) REFERENCES ccns(pubkey) ON DELETE CASCADE,
  FOREIGN KEY (event_id) REFERENCES events(id) ON DELETE CASCADE
);

-- Indexes for efficient querying
CREATE INDEX idx_outbound_queue_status ON outbound_event_queue(status);
CREATE INDEX idx_outbound_queue_ccn_pubkey ON outbound_event_queue(ccn_pubkey);
CREATE INDEX idx_outbound_queue_created_at ON outbound_event_queue(created_at);
CREATE INDEX idx_outbound_queue_last_attempt ON outbound_event_queue(last_attempt);

-- Cleanup trigger for old completed/failed events
CREATE TRIGGER cleanup_old_queue_entries
AFTER UPDATE ON outbound_event_queue
WHEN NEW.status IN ('sent', 'failed') AND NEW.attempts >= 5
BEGIN
  -- Keep failed events for 30 days for debugging, sent events for 1 day
  DELETE FROM outbound_event_queue
  WHERE queue_id = NEW.queue_id
    AND (
      (status = 'sent' AND created_at < (unixepoch() - 86400)) OR
      (status = 'failed' AND created_at < (unixepoch() - 2592000))
    );
END;
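The status column above defines a small lifecycle (pending, sending, then sent or failed). As a rough sketch of the consumer side (not part of this changeset; `db` is assumed in scope and MAX_TRANSMISSION_ATTEMPTS comes from src/consts.ts further down in this diff):

// Sketch only: claim pending work, oldest first, respecting the attempt cap.
const pendingEvents = sql`
  SELECT queue_id, event_id, encrypted_event, ccn_pubkey
  FROM outbound_event_queue
  WHERE status = 'pending' AND attempts < ${MAX_TRANSMISSION_ATTEMPTS}
  ORDER BY created_at ASC
`(db);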
212  public/landing.html  Normal file
@@ -0,0 +1,212 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Eve Relay - Secure Nostr Relay with CCN</title>
  <link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap">
  <style>
    :root {
      --primary: #3498db;
      --primary-dark: #2980b9;
      --text: #2c3e50;
      --text-light: #7f8c8d;
      --background: #ffffff;
      --background-alt: #f8f9fa;
      --border: #e9ecef;
      --success: #2ecc71;
      --warning: #f1c40f;
    }

    * {
      margin: 0;
      padding: 0;
      box-sizing: border-box;
    }

    body {
      font-family: 'Inter', -apple-system, BlinkMacSystemFont, sans-serif;
      line-height: 1.6;
      color: var(--text);
      background: var(--background);
    }

    .container {
      max-width: 1200px;
      margin: 0 auto;
      padding: 2rem;
    }

    header {
      text-align: center;
      padding: 4rem 0;
      background: var(--background-alt);
      border-bottom: 1px solid var(--border);
    }

    .logo {
      width: 120px;
      height: 120px;
      margin-bottom: 1.5rem;
      background: var(--primary);
      border-radius: 20px;
      display: flex;
      align-items: center;
      justify-content: center;
      margin: 0 auto 1.5rem;
    }

    .logo-text {
      font-size: 2.5rem;
      font-weight: 700;
      color: white;
    }

    h1 {
      font-size: 2.5rem;
      font-weight: 700;
      margin-bottom: 1rem;
      color: var(--text);
    }

    .subtitle {
      font-size: 1.25rem;
      color: var(--text-light);
      max-width: 600px;
      margin: 0 auto 2rem;
    }

    h2 {
      font-size: 1.75rem;
      font-weight: 600;
      color: var(--text);
      margin: 2.5rem 0 1rem;
    }

    h3 {
      font-size: 1.25rem;
      font-weight: 600;
      color: var(--text);
      margin: 1.5rem 0 0.75rem;
    }

    code {
      background: var(--background-alt);
      padding: 0.2rem 0.4rem;
      border-radius: 4px;
      font-family: 'Courier New', monospace;
      font-size: 0.9em;
    }

    pre {
      background: var(--background-alt);
      padding: 1.25rem;
      border-radius: 8px;
      overflow-x: auto;
      margin: 1rem 0;
      border: 1px solid var(--border);
    }

    .note {
      background: #fffde7;
      border-left: 4px solid var(--warning);
      padding: 1.25rem;
      margin: 1.5rem 0;
      border-radius: 4px;
    }

    .grid {
      display: grid;
      grid-template-columns: repeat(auto-fit, minmax(300px, 1fr));
      gap: 2rem;
      margin: 2rem 0;
    }

    .card {
      background: var(--background);
      border: 1px solid var(--border);
      border-radius: 8px;
      padding: 1.5rem;
      transition: transform 0.2s, box-shadow 0.2s;
    }

    .card:hover {
      transform: translateY(-2px);
      box-shadow: 0 4px 12px rgba(0,0,0,0.1);
    }

    .command-list {
      list-style: none;
    }

    .command-list li {
      margin-bottom: 0.75rem;
      padding: 0.75rem;
      background: var(--background-alt);
      border-radius: 4px;
    }

    .command-list code {
      color: var(--primary);
    }

    .cta-button {
      display: inline-block;
      background: var(--primary);
      color: white;
      padding: 0.75rem 1.5rem;
      border-radius: 6px;
      text-decoration: none;
      font-weight: 500;
      transition: background 0.2s;
    }

    .cta-button:hover {
      background: var(--primary-dark);
    }
  </style>
</head>
<body>
  <header>
    <div class="logo">
      <span class="logo-text">E</span>
    </div>
    <h1>Eve Relay</h1>
    <p class="subtitle">A secure and efficient Nostr relay with Closed Community Network (CCN) functionality</p>
  </header>

  <div class="container">
    <div class="note">
      <strong>Important:</strong> This relay is designed for WebSocket connections only. HTTP requests are not supported for data operations.
    </div>

    <h2>Connection Details</h2>
    <p>Connect to the relay using WebSocket:</p>
    <pre>ws://localhost:6942</pre>

    <div class="grid">
      <div class="card">
        <h3>Nostr Commands</h3>
        <ul class="command-list">
          <li><code>REQ</code> - Subscribe to events</li>
          <li><code>EVENT</code> - Publish an event</li>
          <li><code>CLOSE</code> - Close a subscription</li>
        </ul>
      </div>

      <div class="card">
        <h3>CCN Commands</h3>
        <ul class="command-list">
          <li><code>CCN CREATE</code> - Create a new CCN</li>
          <li><code>CCN LIST</code> - List all active CCNs</li>
          <li><code>CCN ACTIVATE</code> - Activate a specific CCN</li>
        </ul>
      </div>
    </div>

    <h2>Documentation</h2>
    <p>For detailed information about Arx-CCN functionality and best practices, please refer to the official documentation.</p>
    <a href="#" class="cta-button">View Documentation</a>
  </div>
</body>
</html>
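The connection details on this landing page are enough for a minimal client. A sketch using the documented endpoint and the NIP-01 message shapes listed in the command tables (illustrative filter values; not part of the changeset):

const ws = new WebSocket('ws://localhost:6942');
ws.onopen = () => {
  // REQ: subscription id followed by one or more filters
  ws.send(JSON.stringify(['REQ', 'my-sub', { kinds: [1], limit: 10 }]));
};
ws.onmessage = (raw) => {
  const [type, ...rest] = JSON.parse(raw.data);
  if (type === 'EVENT') console.log('event on', rest[0], rest[1]);
  if (type === 'EOSE') ws.send(JSON.stringify(['CLOSE', 'my-sub'])); // done with stored events
};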
61  src/UserConnection.ts  Normal file
@@ -0,0 +1,61 @@
import type { Database } from 'jsr:@db/sqlite';
import type { NostrEvent, NostrFilter } from 'jsr:@nostrify/types';

export class UserConnection {
  public socket: WebSocket;
  public subscriptions: Map<string, NostrFilter[]>;
  public db: Database;

  constructor(
    socket: WebSocket,
    subscriptions: Map<string, NostrFilter[]>,
    db: Database,
  ) {
    this.socket = socket;
    this.subscriptions = subscriptions;
    this.db = db;
  }

  /**
   * Sends a response to the client
   * @param responseArray The response array to send
   */
  sendResponse(responseArray: unknown[]): void {
    this.socket.send(JSON.stringify(responseArray));
  }

  /**
   * Sends a notice to the client
   * @param message The message to send
   */
  sendNotice(message: string): void {
    this.sendResponse(['NOTICE', message]);
  }

  /**
   * Sends an event to the client
   * @param subscriptionId The subscription ID
   * @param event The event to send
   */
  sendEvent(subscriptionId: string, event: NostrEvent): void {
    this.sendResponse(['EVENT', subscriptionId, event]);
  }

  /**
   * Sends an end-of-stored-events message
   * @param subscriptionId The subscription ID
   */
  sendEOSE(subscriptionId: string): void {
    this.sendResponse(['EOSE', subscriptionId]);
  }

  /**
   * Sends an OK response
   * @param eventId The event ID
   * @param success Whether the operation was successful
   * @param message The message to send
   */
  sendOK(eventId: string, success: boolean, message: string): void {
    this.sendResponse(['OK', eventId, success, message]);
  }
}
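Each helper above maps one-to-one onto a NIP-01 relay-to-client message; for instance (illustrative values):

connection.sendEvent('sub1', event);              // ["EVENT", "sub1", { ... }]
connection.sendEOSE('sub1');                      // ["EOSE", "sub1"]
connection.sendOK(event.id, true, 'Event added'); // ["OK", "<id>", true, "Event added"]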
314  src/commands/ccn.ts  Normal file
@@ -0,0 +1,314 @@
import type { Database } from 'jsr:@db/sqlite';
import { encodeBase64 } from 'jsr:@std/encoding@0.224/base64';
import { hexToBytes } from '@noble/ciphers/utils';
import * as nostrTools from '@nostr/tools';
import type { UserConnection } from '../UserConnection.ts';
import { handleSocketError } from '../index.ts';
import { createNewCCN } from '../utils/createNewCCN.ts';
import { encryptUint8Array, encryptionKey } from '../utils/encryption.ts';
import { getEveFilePath } from '../utils/files.ts';
import { getAllCCNs } from '../utils/getAllCCNs.ts';
import { log } from '../utils/logs.ts';
import { sql } from '../utils/queries.ts';
import {
  SecurityEventType,
  SecuritySeverity,
  logAuthEvent,
  logSecurityEvent,
} from '../utils/securityLogs.ts';

function activateCCN(database: Database, pubkey: string): void {
  sql`UPDATE ccns SET is_active = 0`(database);
  sql`UPDATE ccns SET is_active = 1 WHERE pubkey = ${pubkey}`(database);

  logAuthEvent(SecurityEventType.CCN_ACTIVATION_ATTEMPT, true, {
    ccn_pubkey: pubkey,
  });
}

async function handleCreateCCN(
  connection: UserConnection,
  data: { name: string; seed?: string; creator: string },
): Promise<void> {
  log.debug('start', { tag: 'handleCreateCCN', data });

  logSecurityEvent({
    eventType: SecurityEventType.CCN_CREATION_ATTEMPT,
    severity: SecuritySeverity.MEDIUM,
    source: 'ccn_management',
    details: {
      ccn_name: data.name,
      creator: data.creator,
      has_seed: !!data.seed,
    },
  });

  try {
    if (!data.name || typeof data.name !== 'string') {
      logSecurityEvent({
        eventType: SecurityEventType.CCN_CREATION_ATTEMPT,
        severity: SecuritySeverity.MEDIUM,
        source: 'ccn_management',
        details: {
          error: 'invalid_name',
          name_provided: !!data.name,
          name_type: typeof data.name,
        },
      });

      connection.sendNotice('Name is required');
      return;
    }

    if (!data.creator || typeof data.creator !== 'string') {
      logSecurityEvent({
        eventType: SecurityEventType.CCN_CREATION_ATTEMPT,
        severity: SecuritySeverity.MEDIUM,
        source: 'ccn_management',
        details: {
          error: 'invalid_creator',
          creator_provided: !!data.creator,
          creator_type: typeof data.creator,
        },
      });

      connection.sendNotice('Creator is required');
      return;
    }

    const newCcn = await createNewCCN(
      connection.db,
      data.name,
      data.creator,
      data.seed,
    );
    log.debug('created new CCN', {
      tag: 'handleCreateCCN',
      pubkey: newCcn.pubkey,
    });
    activateCCN(connection.db, newCcn.pubkey);
    log.debug('activated new CCN', {
      tag: 'handleCreateCCN',
      pubkey: newCcn.pubkey,
    });

    logSecurityEvent({
      eventType: SecurityEventType.CCN_CREATION_ATTEMPT,
      severity: SecuritySeverity.LOW,
      source: 'ccn_management',
      details: {
        success: true,
        ccn_pubkey: newCcn.pubkey,
        ccn_name: data.name,
        creator: data.creator,
      },
    });

    connection.sendResponse([
      'OK',
      'CCN CREATED',
      true,
      JSON.stringify({
        pubkey: newCcn.pubkey,
        name: data.name,
      }),
    ]);

    log.info('CCN created', data);
  } catch (error: unknown) {
    log.error('error', { tag: 'handleCreateCCN', error });

    logSecurityEvent({
      eventType: SecurityEventType.CCN_CREATION_ATTEMPT,
      severity: SecuritySeverity.HIGH,
      source: 'ccn_management',
      details: {
        success: false,
        error_message: error instanceof Error ? error.message : 'Unknown error',
        ccn_name: data.name,
        creator: data.creator,
      },
    });

    handleSocketError(connection, 'create CCN', error);
  }
  log.debug('end', { tag: 'handleCreateCCN' });
}

function handleGetCCNs(connection: UserConnection): void {
  try {
    const ccns = getAllCCNs(connection.db);
    connection.sendResponse(['OK', 'CCN LIST', true, JSON.stringify(ccns)]);
  } catch (error: unknown) {
    handleSocketError(connection, 'get CCNs', error);
  }
}

function handleActivateCCN(
  connection: UserConnection,
  data: { pubkey: string },
): void {
  log.debug('start', { tag: 'handleActivateCCN', data });
  try {
    if (!data.pubkey || typeof data.pubkey !== 'string') {
      connection.sendNotice('CCN pubkey is required');
      return;
    }

    const ccnExists = sql`
      SELECT COUNT(*) as count FROM ccns WHERE pubkey = ${data.pubkey}
    `(connection.db)[0].count;

    if (ccnExists === 0) {
      connection.sendNotice('CCN not found');
      log.debug('CCN not found', {
        tag: 'handleActivateCCN',
        pubkey: data.pubkey,
      });
      return;
    }

    for (const subscriptionId of connection.subscriptions.keys()) {
      connection.sendResponse([
        'CLOSED',
        subscriptionId,
        'Subscription closed due to CCN activation',
      ]);
      log.debug('closed subscription', {
        tag: 'handleActivateCCN',
        subscriptionId,
      });
    }

    connection.subscriptions.clear();
    log.info('All subscriptions cleared due to CCN activation', {});

    activateCCN(connection.db, data.pubkey);
    log.debug('activated CCN', {
      tag: 'handleActivateCCN',
      pubkey: data.pubkey,
    });

    const activatedCCN = sql`
      SELECT pubkey, name FROM ccns WHERE pubkey = ${data.pubkey}
    `(connection.db)[0];

    connection.sendResponse([
      'OK',
      'CCN ACTIVATED',
      true,
      JSON.stringify(activatedCCN),
    ]);

    log.info(`CCN activated: ${activatedCCN.name}`, {});
  } catch (error: unknown) {
    log.error('error', { tag: 'handleActivateCCN', error });
    handleSocketError(connection, 'activate CCN', error);
  }
  log.debug('end', { tag: 'handleActivateCCN' });
}

async function handleAddCCN(
  connection: UserConnection,
  data: { name: string; allowedPubkeys: string[]; privateKey: string },
): Promise<void> {
  log.debug('start', { tag: 'handleAddCCN', data });
  try {
    if (!data.privateKey || typeof data.privateKey !== 'string') {
      connection.sendNotice('CCN private key is required');
      return;
    }

    const privateKeyBytes = hexToBytes(data.privateKey);
    const pubkey = nostrTools.getPublicKey(privateKeyBytes);
    log.debug('derived pubkey', { tag: 'handleAddCCN', pubkey });

    const ccnExists = sql`
      SELECT COUNT(*) as count FROM ccns WHERE pubkey = ${pubkey}
    `(connection.db)[0].count;

    if (ccnExists > 0) {
      connection.sendNotice('CCN already exists');
      log.debug('CCN already exists', {
        tag: 'handleAddCCN',
        pubkey,
      });
      return;
    }

    const ccnPublicKey = nostrTools.getPublicKey(privateKeyBytes);
    const ccnPrivPath = await getEveFilePath(`ccn_keys/${ccnPublicKey}`);
    const encryptedPrivateKey = encryptUint8Array(
      privateKeyBytes,
      encryptionKey,
    );
    Deno.writeTextFileSync(ccnPrivPath, encodeBase64(encryptedPrivateKey));

    connection.db.run('BEGIN TRANSACTION');
    log.debug('begin transaction', { tag: 'handleAddCCN' });

    sql`INSERT INTO ccns (pubkey, name) VALUES (${ccnPublicKey}, ${data.name})`(
      connection.db,
    );
    for (const allowedPubkey of data.allowedPubkeys)
      sql`INSERT INTO allowed_writes (ccn_pubkey, pubkey) VALUES (${ccnPublicKey}, ${allowedPubkey})`(
        connection.db,
      );

    connection.db.run('COMMIT TRANSACTION');
    log.debug('committed transaction', { tag: 'handleAddCCN' });
    activateCCN(connection.db, ccnPublicKey);
    log.debug('activated CCN', {
      tag: 'handleAddCCN',
      pubkey: ccnPublicKey,
    });

    connection.sendResponse([
      'OK',
      'CCN ADDED',
      true,
      JSON.stringify({
        pubkey: ccnPublicKey,
        name: data.name,
      }),
    ]);
  } catch (error: unknown) {
    log.error('error', { tag: 'handleAddCCN', error });
    handleSocketError(connection, 'ADD CCN', error);
  }
  log.debug('end', { tag: 'handleAddCCN' });
}

function handleCCNCommands(
  connection: UserConnection,
  command: string,
  data: unknown,
) {
  switch (command) {
    case 'CREATE':
      return handleCreateCCN(
        connection,
        data as { name: string; seed?: string; creator: string },
      );
    case 'ADD':
      return handleAddCCN(
        connection,
        data as { name: string; allowedPubkeys: string[]; privateKey: string },
      );
    case 'LIST':
      return handleGetCCNs(connection);
    case 'ACTIVATE':
      return handleActivateCCN(connection, data as { pubkey: string });
    default:
      return log.warn('Invalid CCN command', {});
  }
}

export {
  activateCCN,
  handleActivateCCN,
  handleAddCCN,
  handleCCNCommands,
  handleCreateCCN,
  handleGetCCNs,
};
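For orientation, the exported dispatcher above is what ties these handlers to the socket layer. A minimal invocation sketch (illustrative values; the message parsing that produces command and data lives outside this file and is not shown in this part of the diff):

// e.g. in the socket message handler, after extracting a CCN command:
handleCCNCommands(connection, 'CREATE', {
  name: 'My CCN',
  creator: userPubkey, // hex pubkey of the creating user (assumed shape)
});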
15  src/commands/close.ts  Normal file
@@ -0,0 +1,15 @@
import type { UserConnection } from '../UserConnection.ts';
import { log } from '../utils/logs.ts';

export function handleClose(
  connection: UserConnection,
  subscriptionId: string,
) {
  if (!connection.subscriptions.has(subscriptionId)) {
    return log.warn(
      `Closing unknown subscription? That's weird. Subscription ID: ${subscriptionId}`,
    );
  }

  connection.subscriptions.delete(subscriptionId);
}
85  src/commands/event.ts  Normal file
@@ -0,0 +1,85 @@
import * as nostrTools from '@nostr/tools';
import type { UserConnection } from '../UserConnection.ts';
import { addEventToDb } from '../dbEvents/addEventToDb.ts';
import {
  EventAlreadyExistsException,
  createEncryptedEvent,
} from '../eventEncryptionDecryption.ts';
import { filtersMatchingEvent } from '../utils/filtersMatchingEvent.ts';
import { getActiveCCN } from '../utils/getActiveCCN.ts';
import { isArray } from '../utils/isArray.ts';
import { log } from '../utils/logs.ts';
import { queueEventForTransmission } from '../utils/outboundQueue.ts';

export async function handleEvent(
  connection: UserConnection,
  event: nostrTools.Event,
) {
  log.debug('start', { tag: 'handleEvent', eventId: event.id });
  const valid = nostrTools.verifyEvent(event);
  if (!valid) {
    connection.sendNotice('Invalid event');
    return log.warn('Invalid event', { tag: 'handleEvent' });
  }

  const activeCCN = getActiveCCN(connection.db);
  if (!activeCCN) {
    connection.sendNotice('No active CCN found');
    return log.warn('No active CCN found', { tag: 'handleEvent' });
  }

  const encryptedEvent = await createEncryptedEvent(event, connection.db);
  try {
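    // Note (an inference from this diff, not an author statement): createEncryptedEvent
    // appears to return an array of chunk events when the encrypted payload exceeds
    // MAX_CHUNK_SIZE (src/consts.ts). In that case only the first chunk is recorded
    // against the event here, while the full value, array or not, is handed to
    // queueEventForTransmission below.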
    if (isArray(encryptedEvent)) {
      log.debug('adding chunked event to database', {
        tag: 'handleEvent',
      });
      addEventToDb(connection.db, event, encryptedEvent[0], activeCCN.pubkey);
    } else {
      addEventToDb(connection.db, event, encryptedEvent, activeCCN.pubkey);
    }

    queueEventForTransmission(
      connection.db,
      event.id,
      encryptedEvent,
      activeCCN.pubkey,
    );

    log.debug('event queued for transmission', {
      tag: 'handleEvent',
      eventId: event.id,
    });
  } catch (e) {
    if (e instanceof EventAlreadyExistsException) {
      log.warn('Event already exists');
      return;
    }
    if (e instanceof Error)
      log.error('error adding event', {
        tag: 'handleEvent',
        error: e.stack,
      });
    else
      log.error('error adding event', {
        tag: 'handleEvent',
        error: String(e),
      });
  }

  connection.sendOK(event.id, true, 'Event added');
  log.debug('sent OK', { tag: 'handleEvent', eventId: event.id });

  const filtersThatMatchEvent = filtersMatchingEvent(event, connection);

  for (let i = 0; i < filtersThatMatchEvent.length; i++) {
    const filter = filtersThatMatchEvent[i];
    connection.sendEvent(filter, event);
    log.debug('sent event to filter', {
      tag: 'handleEvent',
      filter,
      eventId: event.id,
    });
  }
  log.debug('end', { tag: 'handleEvent', eventId: event.id });
}
291  src/commands/request.ts  Normal file
@@ -0,0 +1,291 @@
import type { NostrClientREQ } from 'jsr:@nostrify/types';
import type { UserConnection } from '../UserConnection.ts';
import { isCCNReplaceableEvent } from '../utils/eventTypes.ts';
import { getActiveCCN } from '../utils/getActiveCCN.ts';
import { log } from '../utils/logs.ts';
import { parseATagQuery } from '../utils/parseATagQuery.ts';
import { mixQuery, sql, sqlPartial } from '../utils/queries.ts';

export function handleRequest(
  connection: UserConnection,
  request: NostrClientREQ,
) {
  log.debug('start', { tag: 'handleRequest', request });
  const [, subscriptionId, ...filters] = request;
  if (connection.subscriptions.has(subscriptionId)) {
    return log.warn('Duplicate subscription ID', {
      tag: 'handleRequest',
    });
  }

  log.info(
    `New subscription: ${subscriptionId} with filters: ${JSON.stringify(
      filters,
    )}`,
  );

  const activeCCN = getActiveCCN(connection.db);
  if (!activeCCN) {
    connection.sendNotice('No active CCN found');
    return log.warn('No active CCN found', { tag: 'handleRequest' });
  }

  let query = sqlPartial`SELECT * FROM events WHERE replaced = 0 AND deleted = 0 AND ccn_pubkey = ${activeCCN.pubkey}`;

  let minLimit: number | null = null;
  for (const filter of filters) {
    if (filter.limit && filter.limit > 0) {
      minLimit =
        minLimit === null ? filter.limit : Math.min(minLimit, filter.limit);
    }
  }

  const filtersAreNotEmpty = filters.some((filter) => {
    return Object.values(filter).some((value) => {
      return value.length > 0;
    });
  });

  if (filtersAreNotEmpty) {
    query = mixQuery(query, sqlPartial`AND`);

    for (let i = 0; i < filters.length; i++) {
      // filters act as OR, filter groups act as AND
      query = mixQuery(query, sqlPartial`(`);

      const filter = Object.entries(filters[i]).filter(([type, value]) => {
        if (type === 'ids') return value.length > 0;
        if (type === 'authors') return value.length > 0;
        if (type === 'kinds') return value.length > 0;
        if (type.startsWith('#')) return value.length > 0;
        if (type === 'since') return value > 0;
        if (type === 'until') return value > 0;
        if (type === 'limit') return value > 0;
        return false;
      });

      // `limit` contributes no SQL of its own (it is folded into minLimit above),
      // so iterate the limit-free entries to keep the ANDs between conditions aligned.
      const filterWithoutLimit = filter.filter(([type]) => type !== 'limit');

      for (let j = 0; j < filterWithoutLimit.length; j++) {
        const [type, value] = filterWithoutLimit[j];

        if (type === 'ids') {
          const uniqueIds = [...new Set(value)];
          query = mixQuery(query, sqlPartial`(`);
          for (let k = 0; k < uniqueIds.length; k++) {
            const id = uniqueIds[k] as string;

            query = mixQuery(query, sqlPartial`(id = ${id})`);

            if (k < uniqueIds.length - 1) {
              query = mixQuery(query, sqlPartial`OR`);
            }
          }
          query = mixQuery(query, sqlPartial`)`);
        }

        if (type === 'authors') {
          const uniqueAuthors = [...new Set(value)];
          query = mixQuery(query, sqlPartial`(`);
          for (let k = 0; k < uniqueAuthors.length; k++) {
            const author = uniqueAuthors[k] as string;

            query = mixQuery(query, sqlPartial`(pubkey = ${author})`);

            if (k < uniqueAuthors.length - 1) {
              query = mixQuery(query, sqlPartial`OR`);
            }
          }
          query = mixQuery(query, sqlPartial`)`);
        }

        if (type === 'kinds') {
          const uniqueKinds = [...new Set(value)];
          query = mixQuery(query, sqlPartial`(`);
          for (let k = 0; k < uniqueKinds.length; k++) {
            const kind = uniqueKinds[k] as number;

            query = mixQuery(query, sqlPartial`(kind = ${kind})`);

            if (k < uniqueKinds.length - 1) {
              query = mixQuery(query, sqlPartial`OR`);
            }
          }
          query = mixQuery(query, sqlPartial`)`);
        }

        if (type.startsWith('#')) {
          const tag = type.slice(1);
          const uniqueValues = [...new Set(value)];
          query = mixQuery(query, sqlPartial`(`);
          for (let k = 0; k < uniqueValues.length; k++) {
            const tagValue = uniqueValues[k] as string;
            if (tag === 'a') {
              const aTagInfo = parseATagQuery(tagValue);

              if (aTagInfo.dTag && aTagInfo.dTag !== '') {
                if (isCCNReplaceableEvent(aTagInfo.kind)) {
                  // CCN replaceable event reference
                  query = mixQuery(
                    query,
                    sqlPartial`id IN (
                      SELECT e.id
                      FROM events e
                      JOIN event_tags t ON e.id = t.event_id
                      JOIN event_tags_values v ON t.tag_id = v.tag_id
                      WHERE e.kind = ${aTagInfo.kind}
                        AND t.tag_name = 'd'
                        AND v.value_position = 1
                        AND v.value = ${aTagInfo.dTag}
                    )`,
                  );
                } else {
                  // Addressable event reference
                  query = mixQuery(
                    query,
                    sqlPartial`id IN (
                      SELECT e.id
                      FROM events e
                      JOIN event_tags t ON e.id = t.event_id
                      JOIN event_tags_values v ON t.tag_id = v.tag_id
                      WHERE e.kind = ${aTagInfo.kind}
                        AND e.pubkey = ${aTagInfo.pubkey}
                        AND t.tag_name = 'd'
                        AND v.value_position = 1
                        AND v.value = ${aTagInfo.dTag}
                    )`,
                  );
                }
              } else {
                // Replaceable event reference
                query = mixQuery(
                  query,
                  sqlPartial`id IN (
                    SELECT id
                    FROM events
                    WHERE kind = ${aTagInfo.kind}
                      AND pubkey = ${aTagInfo.pubkey}
                  )`,
                );
              }
            } else {
              // Regular tag handling (unchanged)
              query = mixQuery(
                query,
                sqlPartial`id IN (
                  SELECT t.event_id
                  FROM event_tags t
                  WHERE t.tag_name = ${tag}
                    AND t.tag_id IN (
                      SELECT v.tag_id
                      FROM event_tags_values v
                      WHERE v.value_position = 1
                        AND v.value = ${tagValue}
                    )
                )`,
              );
            }

            if (k < uniqueValues.length - 1) {
              query = mixQuery(query, sqlPartial`OR`);
            }
          }
          query = mixQuery(query, sqlPartial`)`);
        }

        if (type === 'since') {
          query = mixQuery(query, sqlPartial`created_at > ${value}`);
        }

        if (type === 'until') {
          query = mixQuery(query, sqlPartial`created_at <= ${value}`);
        }

        if (j < filterWithoutLimit.length - 1)
          query = mixQuery(query, sqlPartial`AND`);
      }

      query = mixQuery(query, sqlPartial`)`);

      if (i < filters.length - 1) query = mixQuery(query, sqlPartial`OR`);
    }
  }

  query = mixQuery(query, sqlPartial`ORDER BY created_at ASC`);

  if (minLimit !== null) {
    query = mixQuery(query, sqlPartial`LIMIT ${minLimit}`);
  }

  log.debug('built query', {
    tag: 'handleRequest',
    query: query.query,
    values: query.values,
  });

  const events = connection.db.prepare(query.query).all(...query.values);
  log.debug('found events', {
    tag: 'handleRequest',
    count: events.length,
  });

  for (let i = 0; i < events.length; i++) {
    const rawTags = sql`SELECT * FROM event_tags_view WHERE event_id = ${events[i].id}`(
      connection.db,
    );
    const tagsByIndex = new Map<
      number,
      {
        name: string;
        values: Map<number, string>;
      }
    >();

    for (const tag of rawTags) {
      let tagData = tagsByIndex.get(tag.tag_index);
      if (!tagData) {
        tagData = {
          name: tag.tag_name,
          values: new Map(),
        };
        tagsByIndex.set(tag.tag_index, tagData);
      }

      tagData.values.set(tag.tag_value_position, tag.tag_value);
    }

    const tagsArray = Array.from(tagsByIndex.entries())
      .sort(([indexA], [indexB]) => indexA - indexB)
      .map(([_, tagData]) => {
        const { name, values } = tagData;

        return [
          name,
          ...Array.from(values.entries())
            .sort(([posA], [posB]) => posA - posB)
            .map(([_, value]) => value),
        ];
      });

    const event = {
      id: events[i].id,
      pubkey: events[i].pubkey,
      created_at: events[i].created_at,
      kind: events[i].kind,
      tags: tagsArray,
      content: events[i].content,
      sig: events[i].sig,
    };

    connection.sendEvent(subscriptionId, event);
    log.debug('sent event', {
      tag: 'handleRequest',
      subscriptionId,
      eventId: event.id,
    });
  }
  connection.sendEOSE(subscriptionId);
  log.debug('sent EOSE', { tag: 'handleRequest', subscriptionId });
  connection.subscriptions.set(subscriptionId, filters);
  log.debug('end', { tag: 'handleRequest', subscriptionId });
}
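The `a`-tag branches above depend on parseATagQuery, which this part of the diff does not include. In Nostr, an `a` tag value conventionally has the form `<kind>:<pubkey>:<d-tag>`, with the d-tag possibly empty, so a plausible shape for the helper is the sketch below; the real implementation may differ:

// Sketch of the assumed helper, not the file from this changeset.
export function parseATagQuery(aTag: string): {
  kind: number;
  pubkey: string;
  dTag?: string;
} {
  const [kind, pubkey, dTag] = aTag.split(':');
  return { kind: Number(kind), pubkey, dTag };
}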
59  src/consts.ts  Normal file
@@ -0,0 +1,59 @@
/**
 * Minimum required Proof of Work (PoW) difficulty for note acceptance.
 *
 * Notes with PoW below this threshold will be rejected without decryption attempts.
 * This threshold serves as a DoS protection mechanism for the CCN in case of
 * public key compromise.
 */
export const MIN_POW = 8;

/**
 * Target Proof of Work (PoW) difficulty for relay-generated notes.
 *
 * Defines the PoW difficulty level that the relay will compute when generating
 * and encrypting its own notes before broadcasting them to the network.
 *
 * Expected performance on modern hardware (2025):
 * - Difficulty 8: ~1 ms
 * - Difficulty 21: ~5-6 seconds
 */
export const POW_TO_MINE = 10;

/**
 * Maximum size of a note chunk in bytes.
 *
 * This value determines the maximum size of a note that can be encrypted and
 * sent in a single chunk.
 */
export const MAX_CHUNK_SIZE = 32768;

/**
 * Interval for cleaning up expired note chunks, in milliseconds.
 *
 * This value determines how often the relay will check for and remove expired
 * note chunks from the database.
 */
export const CHUNK_CLEANUP_INTERVAL = 1000 * 60 * 60;

/**
 * Maximum age of a note chunk, in milliseconds.
 *
 * This value determines the maximum duration a note chunk can remain in the
 * database before it is considered expired and eligible for cleanup.
 */
export const CHUNK_MAX_AGE = 1000 * 60 * 60 * 24;

/**
 * Interval for processing the outbound event queue, in milliseconds.
 *
 * This determines how often the relay will attempt to send pending events
 * to external relays.
 */
export const QUEUE_PROCESS_INTERVAL = 10000;

/**
 * Maximum number of transmission attempts for outbound events.
 *
 * Events that fail to transmit this many times will be marked as permanently failed.
 */
export const MAX_TRANSMISSION_ATTEMPTS = 5;
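The MIN_POW doc comment above describes a reject-before-decrypt gate. For illustration, such a check is typically a one-liner with nostr-tools' NIP-13 helper (a sketch assuming the surrounding handler shape; not code from this changeset):

import * as nostrTools from '@nostr/tools';

// Drop cheap events before spending any CPU on decryption.
function passesPowGate(event: nostrTools.Event): boolean {
  return nostrTools.nip13.getPow(event.id) >= MIN_POW;
}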
331
src/dbEvents/addEventToDb.ts
Normal file
331
src/dbEvents/addEventToDb.ts
Normal file
|
@ -0,0 +1,331 @@
|
|||
import { bytesToHex } from '@noble/ciphers/utils';
|
||||
import { sha512 } from '@noble/hashes/sha2';
|
||||
import * as nostrTools from '@nostr/tools';
|
||||
import { base64 } from '@scure/base';
|
||||
import type { Database } from 'jsr:@db/sqlite';
|
||||
import { POW_TO_MINE } from '../consts.ts';
|
||||
import { handleDeletionEvent } from '../dbEvents/deletionEvent.ts';
|
||||
import {
|
||||
EventAlreadyExistsException,
|
||||
createEncryptedEventForPubkey,
|
||||
} from '../eventEncryptionDecryption.ts';
|
||||
import { publishToRelays } from '../relays.ts';
|
||||
import {
|
||||
isAddressableEvent,
|
||||
isCCNReplaceableEvent,
|
||||
isDeleteEvent,
|
||||
isReplaceableEvent,
|
||||
} from '../utils/eventTypes.ts';
|
||||
import { getCCNPrivateKeyByPubkey } from '../utils/getCCNPrivateKeyByPubkey.ts';
|
||||
import { log } from '../utils/logs.ts';
|
||||
import { sql } from '../utils/queries.ts';
|
||||
import {
|
||||
SecurityEventType,
|
||||
SecuritySeverity,
|
||||
logCCNViolation,
|
||||
logSecurityEvent,
|
||||
} from '../utils/securityLogs.ts';
|
||||
|
||||
export function addEventToDb(
|
||||
db: Database,
|
||||
decryptedEvent: nostrTools.VerifiedEvent,
|
||||
encryptedEvent: nostrTools.VerifiedEvent,
|
||||
ccnPubkey: string,
|
||||
) {
|
||||
log.debug('start', {
|
||||
tag: 'addEventToDb',
|
||||
decryptedId: decryptedEvent.id,
|
||||
encryptedId: encryptedEvent.id,
|
||||
kind: decryptedEvent.kind,
|
||||
ccnPubkey,
|
||||
});
|
||||
const existingEvent = sql`
|
||||
SELECT * FROM events WHERE id = ${decryptedEvent.id}
|
||||
`(db)[0];
|
||||
|
||||
if (existingEvent) throw new EventAlreadyExistsException();
|
||||
|
||||
if (isDeleteEvent(decryptedEvent.kind)) {
|
||||
log.debug('isDeleteEvent, delegating to handleDeletionEvent', {
|
||||
tag: 'addEventToDb',
|
||||
decryptId: decryptedEvent.id,
|
||||
});
|
||||
handleDeletionEvent(db, decryptedEvent, encryptedEvent, ccnPubkey);
|
||||
return;
|
||||
}
|
||||
|
||||
const isInvite =
|
||||
decryptedEvent.tags.findIndex(
|
||||
(tag: string[]) => tag[0] === 'type' && tag[1] === 'invite',
|
||||
) !== -1;
|
||||
|
||||
if (isInvite) {
|
||||
log.debug('isInvite event', { tag: 'addEventToDb' });
|
||||
const shadContent = bytesToHex(
|
||||
sha512.create().update(decryptedEvent.content).digest(),
|
||||
);
|
||||
|
||||
const inviteUsed = sql`
|
||||
SELECT COUNT(*) as count FROM inviter_invitee WHERE invite_hash = ${shadContent}
|
||||
`(db)[0].count;
|
||||
|
||||
if (inviteUsed > 0) {
|
||||
log.debug('invite already used', { tag: 'addEventToDb' });
|
||||
|
||||
logSecurityEvent({
|
||||
eventType: SecurityEventType.INVITE_ALREADY_USED,
|
||||
severity: SecuritySeverity.HIGH,
|
||||
source: 'invite_processing',
|
||||
details: {
|
||||
invite_hash: shadContent,
|
||||
event_id: decryptedEvent.id,
|
||||
ccn_pubkey: ccnPubkey,
|
||||
invitee_pubkey: decryptedEvent.pubkey,
|
||||
},
|
||||
});
|
||||
|
||||
throw new Error('Invite already used');
|
||||
}
|
||||
|
||||
const inviteEvent = sql`
|
||||
SELECT * FROM events WHERE kind = 9999 AND id IN (
|
||||
SELECT event_id FROM event_tags WHERE tag_name = 'i' AND tag_id IN (
|
||||
SELECT tag_id FROM event_tags_values WHERE value_position = 1 AND value = ${shadContent}
|
||||
)
|
||||
)
|
||||
`(db)[0];
|
||||
|
||||
if (!inviteEvent) {
|
||||
log.debug('invite event not found', { tag: 'addEventToDb' });
|
||||
|
||||
logSecurityEvent({
|
||||
eventType: SecurityEventType.INVITE_VALIDATION_FAILURE,
|
||||
severity: SecuritySeverity.HIGH,
|
||||
source: 'invite_processing',
|
||||
details: {
|
||||
error: 'invite_event_not_found',
|
||||
invite_hash: shadContent,
|
||||
event_id: decryptedEvent.id,
|
||||
ccn_pubkey: ccnPubkey,
|
||||
},
|
||||
});
|
||||
|
||||
throw new Error('Invite event not found');
|
||||
}
|
||||
|
||||
const inviterPubkey = inviteEvent.pubkey;
|
||||
const inviteePubkey = decryptedEvent.pubkey;
|
||||
|
||||
db.run('BEGIN TRANSACTION');
|
||||
log.debug('inserting inviter_invitee and allowed_writes', {
|
||||
tag: 'addEventToDb',
|
||||
});
|
||||
sql`
|
||||
INSERT INTO inviter_invitee (ccn_pubkey, inviter_pubkey, invitee_pubkey, invite_hash) VALUES (${ccnPubkey}, ${inviterPubkey}, ${inviteePubkey}, ${shadContent})
|
||||
`(db);
|
||||
|
||||
sql`
|
||||
INSERT INTO allowed_writes (ccn_pubkey, pubkey) VALUES (${ccnPubkey}, ${inviteePubkey})
|
||||
`(db);
|
||||
|
||||
db.run('COMMIT TRANSACTION');
|
||||
log.debug('committed invite transaction', { tag: 'addEventToDb' });
|
||||
|
||||
const allowedPubkeys = sql`
|
||||
      SELECT pubkey FROM allowed_writes WHERE ccn_pubkey = ${ccnPubkey}
    `(db).flatMap((row) => row.pubkey);
    const ccnName = sql`
      SELECT name FROM ccns WHERE pubkey = ${ccnPubkey}
    `(db)[0].name;

    getCCNPrivateKeyByPubkey(ccnPubkey).then((ccnPrivateKey) => {
      if (!ccnPrivateKey) {
        log.error('CCN private key not found', { tag: 'addEventToDb' });
        throw new Error('CCN private key not found');
      }

      const tags = allowedPubkeys.map((pubkey) => ['p', pubkey]);
      tags.push(['t', 'invite']);
      tags.push(['name', ccnName]);

      const privateKeyEvent = nostrTools.finalizeEvent(
        nostrTools.nip13.minePow(
          {
            kind: 9998,
            created_at: Math.floor(Date.now() / 1000), // nostr created_at is in seconds, not milliseconds
            content: base64.encode(ccnPrivateKey),
            tags,
            pubkey: ccnPubkey,
          },
          POW_TO_MINE,
        ),
        ccnPrivateKey,
      );

      const encryptedKeyEvent = createEncryptedEventForPubkey(
        inviteePubkey,
        privateKeyEvent,
      );
      publishToRelays(encryptedKeyEvent);
      log.debug('published encryptedKeyEvent to relays', {
        tag: 'addEventToDb',
      });
    });

    return;
  }

  const isAllowedWrite = sql`
    SELECT COUNT(*) as count FROM allowed_writes WHERE ccn_pubkey = ${ccnPubkey} AND pubkey = ${decryptedEvent.pubkey}
  `(db)[0].count;

  if (isAllowedWrite === 0) {
    log.debug('not allowed to write to this CCN', {
      tag: 'addEventToDb',
      pubkey: decryptedEvent.pubkey,
    });

    logCCNViolation(
      SecurityEventType.UNAUTHORIZED_WRITE_ATTEMPT,
      ccnPubkey,
      'write_event',
      {
        attempted_pubkey: decryptedEvent.pubkey,
        event_id: decryptedEvent.id,
        event_kind: decryptedEvent.kind,
        ccn_pubkey: ccnPubkey,
      },
    );

    throw new Error('Not allowed to write to this CCN');
  }

  try {
    db.run('BEGIN TRANSACTION');
    log.debug('begin transaction', { tag: 'addEventToDb' });

    if (isReplaceableEvent(decryptedEvent.kind)) {
      log.debug('isReplaceableEvent, updating replaced events', {
        tag: 'addEventToDb',
      });
      sql`
        UPDATE events
        SET replaced = 1
        WHERE kind = ${decryptedEvent.kind}
          AND pubkey = ${decryptedEvent.pubkey}
          AND ccn_pubkey = ${ccnPubkey}
          AND (created_at < ${decryptedEvent.created_at} OR
            (created_at = ${decryptedEvent.created_at} AND id > ${decryptedEvent.id}))
      `(db);
    }

    if (isAddressableEvent(decryptedEvent.kind)) {
      log.debug('isAddressableEvent, updating replaced events', {
        tag: 'addEventToDb',
      });
      const dTag = decryptedEvent.tags.find((tag) => tag[0] === 'd')?.[1];
      if (dTag) {
        sql`
          UPDATE events
          SET replaced = 1
          WHERE kind = ${decryptedEvent.kind}
            AND pubkey = ${decryptedEvent.pubkey}
            AND ccn_pubkey = ${ccnPubkey}
            AND (created_at < ${decryptedEvent.created_at} OR
              (created_at = ${decryptedEvent.created_at} AND id > ${decryptedEvent.id}))
            AND id IN (
              SELECT event_id FROM event_tags
              WHERE tag_name = 'd'
                AND tag_id IN (
                  SELECT tag_id FROM event_tags_values
                  WHERE value_position = 1
                    AND value = ${dTag}
                )
            )
        `(db);
      }
    }

    if (isCCNReplaceableEvent(decryptedEvent.kind)) {
      log.debug('isCCNReplaceableEvent, updating replaced events', {
        tag: 'addEventToDb',
      });
      const dTag = decryptedEvent.tags.find((tag) => tag[0] === 'd')?.[1];
      log.debug('dTag', { tag: 'addEventToDb', dTag });
      if (dTag) {
        sql`
          UPDATE events
          SET replaced = 1
          WHERE kind = ${decryptedEvent.kind}
            AND ccn_pubkey = ${ccnPubkey}
            AND (created_at < ${decryptedEvent.created_at} OR
              (created_at = ${decryptedEvent.created_at} AND id > ${decryptedEvent.id}))
            AND id IN (
              SELECT event_id FROM event_tags
              WHERE tag_name = 'd'
                AND tag_id IN (
                  SELECT tag_id FROM event_tags_values
                  WHERE value_position = 1
                    AND value = ${dTag}
                )
            )
        `(db);
      } else {
        sql`
          UPDATE events
          SET replaced = 1
          WHERE kind = ${decryptedEvent.kind}
            AND ccn_pubkey = ${ccnPubkey}
            AND (created_at < ${decryptedEvent.created_at} OR
              (created_at = ${decryptedEvent.created_at} AND id > ${decryptedEvent.id}))
        `(db);
      }
    }

    sql`
      INSERT INTO events (id, original_id, pubkey, created_at, kind, content, sig, first_seen, ccn_pubkey) VALUES (
        ${decryptedEvent.id},
        ${encryptedEvent.id},
        ${decryptedEvent.pubkey},
        ${decryptedEvent.created_at},
        ${decryptedEvent.kind},
        ${decryptedEvent.content},
        ${decryptedEvent.sig},
        unixepoch(),
        ${ccnPubkey}
      )
    `(db);
    log.debug('inserted event', { tag: 'addEventToDb', id: decryptedEvent.id });
    if (decryptedEvent.tags) {
      for (let i = 0; i < decryptedEvent.tags.length; i++) {
        const tag = sql`
          INSERT INTO event_tags(event_id, tag_name, tag_index) VALUES (
            ${decryptedEvent.id},
            ${decryptedEvent.tags[i][0]},
            ${i}
          ) RETURNING tag_id
        `(db)[0];
        for (let j = 1; j < decryptedEvent.tags[i].length; j++) {
          sql`
            INSERT INTO event_tags_values(tag_id, value_position, value) VALUES (
              ${tag.tag_id},
              ${j},
              ${decryptedEvent.tags[i][j]}
            )
          `(db);
        }
      }
      log.debug('inserted tags for event', {
        tag: 'addEventToDb',
        id: decryptedEvent.id,
      });
    }
    db.run('COMMIT TRANSACTION');
    log.debug('committed transaction', { tag: 'addEventToDb' });
  } catch (e) {
    db.run('ROLLBACK TRANSACTION');
    log.error('transaction rolled back', { tag: 'addEventToDb', error: e });
    throw e;
  }
  log.debug('end', { tag: 'addEventToDb', id: decryptedEvent.id });
}
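Note: the replaced-event UPDATEs above encode NIP-01 replaceable semantics. A minimal sketch of the same tie-break as a hypothetical helper (not part of this diff): the newer created_at wins, and on equal timestamps the lexicographically smaller id wins.

// Hypothetical helper mirroring the UPDATE's WHERE clause: should `candidate`
// replace `existing` for the same (kind, pubkey, ccn_pubkey)?
function shouldReplace(
  candidate: { created_at: number; id: string },
  existing: { created_at: number; id: string },
): boolean {
  if (existing.created_at !== candidate.created_at) {
    return existing.created_at < candidate.created_at; // newer event wins
  }
  return existing.id > candidate.id; // tie-break: smaller id wins
}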
143 src/dbEvents/deletionEvent.ts Normal file
@@ -0,0 +1,143 @@
import type { Database } from '@db/sqlite';
import type * as nostrTools from '@nostr/tools';
import { isAddressableEvent, isReplaceableEvent } from '../utils/eventTypes.ts';
import { log } from '../utils/logs.ts';
import { parseATagQuery } from '../utils/parseATagQuery.ts';
import { sql } from '../utils/queries.ts';

export function handleDeletionEvent(
  db: Database,
  deletionEvent: nostrTools.VerifiedEvent,
  encryptedEvent: nostrTools.VerifiedEvent,
  ccnPubkey: string,
): void {
  log.debug('start', {
    tag: 'handleDeletionEvent',
    decryptId: deletionEvent.id,
    encryptedId: encryptedEvent.id,
    kind: deletionEvent.kind,
    ccnPubkey,
  });

  try {
    db.run('BEGIN TRANSACTION');
    log.debug('begin transaction', { tag: 'handleDeletionEvent' });

    sql`
      INSERT INTO events (id, original_id, pubkey, created_at, kind, content, sig, first_seen, ccn_pubkey) VALUES (
        ${deletionEvent.id},
        ${encryptedEvent.id},
        ${deletionEvent.pubkey},
        ${deletionEvent.created_at},
        ${deletionEvent.kind},
        ${deletionEvent.content},
        ${deletionEvent.sig},
        unixepoch(),
        ${ccnPubkey}
      )
    `(db);

    if (deletionEvent.tags) {
      for (let i = 0; i < deletionEvent.tags.length; i++) {
        const tag = sql`
          INSERT INTO event_tags(event_id, tag_name, tag_index) VALUES (
            ${deletionEvent.id},
            ${deletionEvent.tags[i][0]},
            ${i}
          ) RETURNING tag_id
        `(db)[0];

        for (let j = 1; j < deletionEvent.tags[i].length; j++) {
          sql`
            INSERT INTO event_tags_values(tag_id, value_position, value) VALUES (
              ${tag.tag_id},
              ${j},
              ${deletionEvent.tags[i][j]}
            )
          `(db);
        }
      }
    }

    for (const tag of deletionEvent.tags) {
      if (tag[0] === 'e' && tag[1]) {
        sql`
          UPDATE events
          SET deleted = 1
          WHERE id = ${tag[1]}
            AND pubkey = ${deletionEvent.pubkey}
            AND ccn_pubkey = ${ccnPubkey}
        `(db);
        log.debug('deleted event by id', {
          tag: 'handleDeletionEvent',
          eventId: tag[1],
        });
      } else if (tag[0] === 'a' && tag[1]) {
        const { kind, pubkey, dTag } = parseATagQuery(tag[1]);
        if (!kind || !pubkey) continue;
        if (pubkey !== deletionEvent.pubkey) continue;
        if (isReplaceableEvent(kind)) {
          sql`
            UPDATE events
            SET deleted = 1
            WHERE kind = ${kind}
              AND pubkey = ${pubkey}
              AND ccn_pubkey = ${ccnPubkey}
              AND created_at <= ${deletionEvent.created_at}
          `(db);
          log.debug('deleted replaceable event', {
            tag: 'handleDeletionEvent',
            kind,
            pubkey,
          });
        } else if (isAddressableEvent(kind) && dTag) {
          sql`
            UPDATE events
            SET deleted = 1
            WHERE kind = ${kind}
              AND pubkey = ${pubkey}
              AND ccn_pubkey = ${ccnPubkey}
              AND created_at <= ${deletionEvent.created_at}
              AND id IN (
                SELECT event_id FROM event_tags
                WHERE tag_name = 'd'
                  AND tag_id IN (
                    SELECT tag_id FROM event_tags_values
                    WHERE value_position = 1 AND value = ${dTag}
                  )
              )
          `(db);
          log.debug('deleted addressable event', {
            tag: 'handleDeletionEvent',
            kind,
            pubkey,
            dTag,
          });
        }
      } else if (tag[0] === 'k') {
        sql`
          UPDATE events
          SET deleted = 1
          WHERE kind = ${tag[1]}
            AND pubkey = ${deletionEvent.pubkey}
            AND ccn_pubkey = ${ccnPubkey}
            AND created_at <= ${deletionEvent.created_at}
        `(db);
        log.debug('deleted events of kind', {
          tag: 'handleDeletionEvent',
          kind: tag[1],
        });
      }
    }
    db.run('COMMIT TRANSACTION');
    log.debug('committed transaction', { tag: 'handleDeletionEvent' });
  } catch (e) {
    db.run('ROLLBACK TRANSACTION');
    log.error('transaction rolled back', {
      tag: 'handleDeletionEvent',
      error: e,
    });
    throw e;
  }
  log.debug('end', { tag: 'handleDeletionEvent', id: deletionEvent.id });
}
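For reference, a sketch of the NIP-09 deletion request this handler consumes; every id and pubkey below is a placeholder. Note that this handler treats a bare 'k' tag as deleting all of the author's events of that kind up to the deletion's timestamp.

const deletionRequest = {
  kind: 5,
  pubkey: '<author pubkey>',
  created_at: Math.floor(Date.now() / 1000),
  content: 'posted by accident',
  tags: [
    ['e', '<event id>'], // delete a single event by id
    ['a', '30023:<author pubkey>:<d-tag>'], // delete an addressable event
    ['k', '1'], // delete all kind-1 events by this author
  ],
};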
401 src/eventEncryptionDecryption.ts Normal file
@@ -0,0 +1,401 @@
import type { Database } from '@db/sqlite';
import * as nostrTools from '@nostr/tools';
import { nip44 } from '@nostr/tools';
import { MAX_CHUNK_SIZE, MIN_POW, POW_TO_MINE } from './consts.ts';
import { getActiveCCN } from './utils/getActiveCCN.ts';
import { getAllCCNs } from './utils/getAllCCNs.ts';
import { getCCNPrivateKeyByPubkey } from './utils/getCCNPrivateKeyByPubkey.ts';
import { log } from './utils/logs.ts';
import { None, type Option, Some, flatMap, map } from './utils/option.ts';
import { sql } from './utils/queries.ts';
import { randomTimeUpTo2DaysInThePast } from './utils/randomTimeUpTo2DaysInThePast.ts';

export class EventAlreadyExistsException extends Error {}
export class ChunkedEventReceived extends Error {}

export function createEncryptedEventForPubkey(
  pubkey: string,
  event: nostrTools.VerifiedEvent,
) {
  const randomPrivateKey = nostrTools.generateSecretKey();
  const randomPrivateKeyPubKey = nostrTools.getPublicKey(randomPrivateKey);
  const conversationKey = nip44.getConversationKey(randomPrivateKey, pubkey);

  const eventJson = JSON.stringify(event);
  const encryptedEvent = nip44.encrypt(eventJson, conversationKey);

  const sealTemplate = {
    kind: 13,
    created_at: randomTimeUpTo2DaysInThePast(),
    content: encryptedEvent,
    tags: [],
  };

  const seal = nostrTools.finalizeEvent(sealTemplate, randomPrivateKey);
  const giftWrapTemplate = {
    kind: 1059,
    created_at: randomTimeUpTo2DaysInThePast(),
    content: nip44.encrypt(JSON.stringify(seal), conversationKey),
    tags: [['p', pubkey]],
    pubkey: randomPrivateKeyPubKey,
  };
  const minedGiftWrap = nostrTools.nip13.minePow(giftWrapTemplate, POW_TO_MINE);

  const giftWrap = nostrTools.finalizeEvent(minedGiftWrap, randomPrivateKey);
  return giftWrap;
}

export function createEncryptedChunkForPubkey(
  pubkey: string,
  chunk: string,
  chunkIndex: number,
  totalChunks: number,
  messageId: string,
  privateKey: Uint8Array,
) {
  const randomPrivateKey = nostrTools.generateSecretKey();
  const randomPrivateKeyPubKey = nostrTools.getPublicKey(randomPrivateKey);
  const conversationKey = nip44.getConversationKey(randomPrivateKey, pubkey);

  const sealTemplate = {
    kind: 13,
    created_at: randomTimeUpTo2DaysInThePast(),
    content: nip44.encrypt(chunk, conversationKey),
    tags: [['chunk', String(chunkIndex), String(totalChunks), messageId]],
  };

  const seal = nostrTools.finalizeEvent(sealTemplate, privateKey);
  const giftWrapTemplate = {
    kind: 1059,
    created_at: randomTimeUpTo2DaysInThePast(),
    content: nip44.encrypt(JSON.stringify(seal), conversationKey),
    tags: [['p', pubkey]],
    pubkey: randomPrivateKeyPubKey,
  };

  const minedGiftWrap = nostrTools.nip13.minePow(giftWrapTemplate, POW_TO_MINE);
  const giftWrap = nostrTools.finalizeEvent(minedGiftWrap, randomPrivateKey);
  return giftWrap;
}

export async function createEncryptedEvent(
  event: nostrTools.VerifiedEvent,
  db: Database,
): Promise<nostrTools.VerifiedEvent | nostrTools.VerifiedEvent[]> {
  if (!event.id) throw new Error('Event must have an ID');
  if (!event.sig) throw new Error('Event must be signed');

  const activeCCN = getActiveCCN(db);
  if (!activeCCN) throw new Error('No active CCN found');

  const ccnPubKey = activeCCN.pubkey;
  const ccnPrivateKey = await getCCNPrivateKeyByPubkey(ccnPubKey);

  const eventJson = JSON.stringify(event);
  if (eventJson.length <= MAX_CHUNK_SIZE) {
    return createEncryptedEventForPubkey(ccnPubKey, event);
  }

  const chunks: string[] = [];
  for (let i = 0; i < eventJson.length; i += MAX_CHUNK_SIZE)
    chunks.push(eventJson.slice(i, i + MAX_CHUNK_SIZE));

  const messageId = crypto.randomUUID();
  const totalChunks = chunks.length;

  const encryptedChunks = [];
  for (let i = 0; i < chunks.length; i++) {
    const chunk = chunks[i];
    encryptedChunks.push(
      createEncryptedChunkForPubkey(
        ccnPubKey,
        chunk,
        i,
        totalChunks,
        messageId,
        ccnPrivateKey,
      ),
    );
  }

  return encryptedChunks;
}

/**
 * Attempts to decrypt an event using a specific CCN private key
 * @returns The decrypted event with CCN pubkey if successful, None otherwise
 */
function attemptDecryptWithKey(
  event: nostrTools.Event,
  ccnPrivkey: Uint8Array,
  ccnPubkey: string,
): Option<nostrTools.VerifiedEvent> {
  try {
    const conversationKey = nip44.getConversationKey(ccnPrivkey, event.pubkey);
    const sealResult = map(
      Some(nip44.decrypt(event.content, conversationKey)),
      JSON.parse,
    );

    return flatMap(sealResult, (seal) => {
      if (!seal || seal.kind !== 13) return None();

      const chunkTag = seal.tags.find((tag: string[]) => tag[0] === 'chunk');
      if (!chunkTag) {
        const contentResult = map(
          Some(nip44.decrypt(seal.content, conversationKey)),
          JSON.parse,
        );
        return map(contentResult, (content) => ({ ...content, ccnPubkey }));
      }

      return None();
    });
  } catch {
    return None();
  }
}

/**
 * Handles a chunked message by storing it in the database and checking if all chunks are received
 * @returns The complete decrypted event if all chunks are received, throws ChunkedEventReceived otherwise
 */
function handleChunkedMessage(
  db: Database,
  event: nostrTools.Event,
  ccnPrivkey: Uint8Array,
  ccnPubkey: string,
): nostrTools.VerifiedEvent {
  const conversationKey = nip44.getConversationKey(ccnPrivkey, event.pubkey);
  const sealResult = map(
    Some(nip44.decrypt(event.content, conversationKey)),
    JSON.parse,
  );

  const seal = sealResult.isSome ? sealResult.value : null;
  if (!seal) {
    throw new Error('Invalid chunked message format');
  }

  const chunkTag = seal.tags.find((tag: string[]) => tag[0] === 'chunk');
  if (!chunkTag) {
    throw new Error('Invalid chunked message format');
  }

  const [_, chunkIndexStr, totalChunksStr, messageId] = chunkTag;
  const chunkIndex = Number(chunkIndexStr);
  const totalChunks = Number(totalChunksStr);

  if (!Number.isInteger(chunkIndex) || chunkIndex < 0) {
    throw new Error('Invalid chunk index');
  }
  if (
    !Number.isInteger(totalChunks) ||
    totalChunks <= 0 ||
    totalChunks > 1000
  ) {
    throw new Error('Invalid total chunks count');
  }
  if (chunkIndex >= totalChunks) {
    throw new Error('Chunk index exceeds total chunks');
  }
  if (!messageId || typeof messageId !== 'string' || messageId.length > 100) {
    throw new Error('Invalid message ID');
  }

  const chunk = nip44.decrypt(seal.content, conversationKey);

  if (chunk.length > MAX_CHUNK_SIZE * 3) {
    throw new Error('Chunk content too large');
  }

  let isMessageComplete = false;
  let reconstructedEvent: nostrTools.VerifiedEvent | null = null;

  try {
    db.run('BEGIN IMMEDIATE TRANSACTION');

    const insertStmt = db.prepare(`
      INSERT OR IGNORE INTO event_chunks
      (message_id, chunk_index, total_chunks, content, created_at, ccn_pubkey)
      VALUES (?, ?, ?, ?, ?, ?)
    `);

    const insertResult = insertStmt.run(
      messageId,
      chunkIndex,
      totalChunks,
      chunk,
      Math.floor(Date.now() / 1000),
      ccnPubkey,
    );

    if (insertResult === 0) {
      db.run('ROLLBACK TRANSACTION');
      throw new ChunkedEventReceived();
    }

    const currentChunkCount = sql`
      SELECT COUNT(DISTINCT chunk_index) as count
      FROM event_chunks
      WHERE message_id = ${messageId}
        AND ccn_pubkey = ${ccnPubkey}
        AND total_chunks = ${totalChunks}
    `(db)[0].count;

    if (currentChunkCount === totalChunks) {
      const chunkGapCheck = sql`
        SELECT COUNT(*) as count
        FROM event_chunks
        WHERE message_id = ${messageId}
          AND ccn_pubkey = ${ccnPubkey}
          AND chunk_index NOT IN (
            SELECT DISTINCT chunk_index
            FROM event_chunks
            WHERE message_id = ${messageId}
              AND ccn_pubkey = ${ccnPubkey}
            ORDER BY chunk_index
            LIMIT ${totalChunks}
          )
      `(db)[0].count;

      if (chunkGapCheck > 0) {
        db.run('ROLLBACK TRANSACTION');
        throw new Error('Chunk sequence validation failed');
      }

      const allChunks = sql`
        SELECT content, chunk_index
        FROM event_chunks
        WHERE message_id = ${messageId}
          AND ccn_pubkey = ${ccnPubkey}
        ORDER BY chunk_index
      `(db);

      let fullContent = '';
      for (let i = 0; i < allChunks.length; i++) {
        const chunkData = allChunks[i];
        if (chunkData.chunk_index !== i) {
          db.run('ROLLBACK TRANSACTION');
          throw new Error('Chunk sequence integrity violation');
        }
        fullContent += chunkData.content;
      }

      if (fullContent.length === 0) {
        db.run('ROLLBACK TRANSACTION');
        throw new Error('Empty reconstructed content');
      }

      try {
        const content = JSON.parse(fullContent);
        reconstructedEvent = { ...content, ccnPubkey };
        isMessageComplete = true;

        sql`
          DELETE FROM event_chunks
          WHERE message_id = ${messageId}
            AND ccn_pubkey = ${ccnPubkey}
        `(db);
      } catch {
        db.run('ROLLBACK TRANSACTION');
        throw new Error('Failed to parse reconstructed message content');
      }
    }

    db.run('COMMIT TRANSACTION');
  } catch (error) {
    try {
      db.run('ROLLBACK TRANSACTION');
    } catch (rollbackError) {
      log.error('Failed to rollback transaction', {
        tag: 'handleChunkedMessage',
        error: rollbackError,
      });
    }
    throw error;
  }

  if (isMessageComplete && reconstructedEvent) {
    return reconstructedEvent;
  }

  throw new ChunkedEventReceived();
}

export async function decryptEvent(
  db: Database,
  event: nostrTools.Event,
): Promise<nostrTools.VerifiedEvent & { ccnPubkey: string }> {
  if (event.kind !== 1059) {
    throw new Error('Cannot decrypt event -- not a gift wrap');
  }

  const allCCNs = getAllCCNs(db);
  if (allCCNs.length === 0) {
    throw new Error('No CCNs found');
  }

  if (
    nostrTools.nip13.getPow(event.id) < MIN_POW &&
    !event.tags.some((t) => t[0] === 'type' && t[1] === 'invite')
  ) {
    throw new Error('Cannot decrypt event -- PoW too low');
  }

  const isInvite =
    event.tags.findIndex(
      (tag: string[]) => tag[0] === 'type' && tag[1] === 'invite',
    ) !== -1;
  const eventDestination = event.tags.find(
    (tag: string[]) => tag[0] === 'p',
  )?.[1];

  if (!eventDestination) {
    throw new Error('Cannot decrypt event -- no destination');
  }

  if (isInvite) {
    const ccnPrivkey = await getCCNPrivateKeyByPubkey(eventDestination);
    const decryptedEvent = attemptDecryptWithKey(
      event,
      ccnPrivkey,
      eventDestination,
    );
    if (decryptedEvent.isSome) {
      const recipient = decryptedEvent.value.tags.find(
        (tag: string[]) => tag[0] === 'p',
      )?.[1];
      if (recipient !== eventDestination)
        throw new Error('Cannot decrypt invite');
      return { ...decryptedEvent.value, ccnPubkey: eventDestination };
    }
    throw new Error('Cannot decrypt invite');
  }

  const ccnPrivkey = await getCCNPrivateKeyByPubkey(eventDestination);
  const decryptedEvent = attemptDecryptWithKey(
    event,
    ccnPrivkey,
    eventDestination,
  );
  if (decryptedEvent.isSome) {
    return { ...decryptedEvent.value, ccnPubkey: eventDestination };
  }

  try {
    const chunked = handleChunkedMessage(
      db,
      event,
      ccnPrivkey,
      eventDestination,
    );
    return { ...chunked, ccnPubkey: eventDestination };
  } catch (e) {
    if (e instanceof ChunkedEventReceived) {
      throw e;
    }
  }

  throw new Error('Failed to decrypt event with any CCN key');
}
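A usage sketch, assuming `signedEvent` is a finalized, verified event and `db` is the open database; oversized events come back as an array of gift-wrapped chunks, which publishToRelays (src/relays.ts) accepts directly.

import { publishToRelays } from './relays.ts';

// Encrypt for the active CCN; result is one gift wrap or an array of chunks.
const wrapped = await createEncryptedEvent(signedEvent, db);
await publishToRelays(wrapped);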
397 src/index.ts Normal file
@@ -0,0 +1,397 @@
import { randomBytes } from '@noble/ciphers/webcrypto';
import * as nostrTools from '@nostr/tools';
import { Database } from 'jsr:@db/sqlite';
import { NSchema as n } from 'jsr:@nostrify/nostrify';
import { encodeBase64 } from 'jsr:@std/encoding@0.224/base64';
import { UserConnection } from './UserConnection.ts';
import { handleCCNCommands } from './commands/ccn.ts';
import { handleClose } from './commands/close.ts';
import { handleEvent } from './commands/event.ts';
import { handleRequest } from './commands/request.ts';
import { CHUNK_CLEANUP_INTERVAL, QUEUE_PROCESS_INTERVAL } from './consts.ts';
import { addEventToDb } from './dbEvents/addEventToDb.ts';
import {
  ChunkedEventReceived,
  EventAlreadyExistsException,
  decryptEvent,
} from './eventEncryptionDecryption.ts';
import { runMigrations } from './migrations.ts';
import { pool, relays } from './relays.ts';
import { cleanupOldChunks } from './utils/cleanupOldChunks.ts';
import { getEveFilePath } from './utils/files.ts';
import { getEncryptedEventByOriginalId } from './utils/getEncryptedEventByOriginalId.ts';
import { isArray } from './utils/isArray.ts';
import { isLocalhost } from './utils/isLocalhost.ts';
import { isValidJSON } from './utils/isValidJSON.ts';
import {
  isOriginalEventIdCached,
  updateKnownEventsCache,
} from './utils/knownEventsCache.ts';
import { log, setupLogger } from './utils/logs.ts';
import {
  getQueueStats,
  processOutboundQueue,
  processStartupQueue,
} from './utils/outboundQueue.ts';
import { sql } from './utils/queries.ts';
import {
  SecurityEventType,
  SecuritySeverity,
  logSecurityEvent,
  logSystemEvent,
} from './utils/securityLogs.ts';

await setupLogger(null);
await Deno.mkdir(await getEveFilePath('ccn_keys'), { recursive: true });

logSystemEvent(SecurityEventType.SYSTEM_STARTUP, {
  version: 'alpha',
  deno_version: Deno.version.deno,
  timestamp: new Date().toISOString(),
});

if (!Deno.env.has('ENCRYPTION_KEY')) {
  const newKey = encodeBase64(randomBytes(32));
  log.error(
    `Missing ENCRYPTION_KEY. Please set it in your env.\nA new one has been generated for you: ENCRYPTION_KEY="${newKey}"`,
  );

  logSecurityEvent({
    eventType: SecurityEventType.CONFIGURATION_LOADED,
    severity: SecuritySeverity.CRITICAL,
    source: 'startup',
    details: {
      error: 'missing_encryption_key',
      generated_new_key: true,
    },
  });

  Deno.exit(1);
}

logSystemEvent(SecurityEventType.CONFIGURATION_LOADED, {
  encryption_key_present: true,
  ccn_keys_directory: await getEveFilePath('ccn_keys'),
});

export const db = new Database(await getEveFilePath('db'));
await setupLogger(db);

/**
 * Creates a subscription event handler for processing encrypted events.
 * This handler decrypts and adds valid events to the database.
 * @param database The database instance to use
 * @returns An event handler function
 */
function createSubscriptionEventHandler(db: Database) {
  return async (event: nostrTools.Event) => {
    if (isOriginalEventIdCached(event.id)) return;
    if (!nostrTools.verifyEvent(event)) {
      log.warn('Invalid event received');

      logSecurityEvent({
        eventType: SecurityEventType.INVALID_SIGNATURE,
        severity: SecuritySeverity.MEDIUM,
        source: 'event_processing',
        details: {
          event_id: event.id,
          event_kind: event.kind,
          pubkey: event.pubkey,
        },
      });

      return;
    }
    if (getEncryptedEventByOriginalId(db, event)) return;
    try {
      const decryptedEvent = await decryptEvent(db, event);
      addEventToDb(db, decryptedEvent, event, decryptedEvent.ccnPubkey);
      updateKnownEventsCache();
    } catch (e) {
      if (e instanceof EventAlreadyExistsException) {
        logSecurityEvent({
          eventType: SecurityEventType.DUPLICATE_EVENT_BLOCKED,
          severity: SecuritySeverity.LOW,
          source: 'event_processing',
          details: {
            event_id: event.id,
            reason: 'event_already_exists',
          },
        });
        return;
      }
      if (e instanceof ChunkedEventReceived) {
        logSecurityEvent({
          eventType: SecurityEventType.CHUNKED_EVENT_RECEIVED,
          severity: SecuritySeverity.LOW,
          source: 'event_processing',
          details: {
            event_id: event.id,
            event_kind: event.kind,
          },
        });
        return;
      }

      logSecurityEvent({
        eventType: SecurityEventType.DECRYPTION_FAILURE,
        severity: SecuritySeverity.MEDIUM,
        source: 'event_processing',
        details: {
          operation: 'event_decryption',
          event_id: event.id,
          event_kind: event.kind,
          error_message: e instanceof Error ? e.message : 'Unknown error',
        },
      });
    }
  };
}

function setupAndSubscribeToExternalEvents() {
  const isInitialized = sql`
    SELECT name FROM sqlite_master WHERE type='table' AND name='migration_history'
  `(db)[0];

  if (!isInitialized) runMigrations(db, -1);

  const latestVersion =
    sql`
      SELECT migration_version FROM migration_history WHERE status = 'success' ORDER BY migration_version DESC LIMIT 1
    `(db)[0]?.migration_version ?? -1;

  runMigrations(db, latestVersion);

  logSystemEvent(SecurityEventType.SYSTEM_STARTUP, {
    database_logging_enabled: true,
    migrations_complete: true,
  });

  const allCCNs = sql`SELECT pubkey FROM ccns`(db);
  const ccnPubkeys = allCCNs.map((ccn) => ccn.pubkey);

  pool.subscribeMany(
    relays,
    [
      {
        '#p': ccnPubkeys,
        kinds: [1059],
      },
    ],
    {
      onevent: createSubscriptionEventHandler(db),
    },
  );

  updateKnownEventsCache();
  setInterval(() => cleanupOldChunks(db), CHUNK_CLEANUP_INTERVAL);

  processStartupQueue(db).catch((error: unknown) => {
    log.error('Startup queue processing failed', {
      tag: 'outboundQueue',
      error: error instanceof Error ? error.message : 'Unknown error',
    });
  });

  setInterval(async () => {
    try {
      await processOutboundQueue(db);

      const stats = getQueueStats(db);
      if (stats.total > 0) {
        log.info('Outbound queue status', {
          tag: 'outboundQueue',
          ...stats,
        });
      }
    } catch (error) {
      log.error('Error processing outbound queue', {
        tag: 'outboundQueue',
        error: error instanceof Error ? error.message : 'Unknown error',
      });
    }
  }, QUEUE_PROCESS_INTERVAL);

  processOutboundQueue(db).catch((error) => {
    log.error('Initial queue processing failed', {
      tag: 'outboundQueue',
      error: error instanceof Error ? error.message : 'Unknown error',
    });
  });

  log.info('Outbound queue processor started', {
    tag: 'outboundQueue',
    interval: QUEUE_PROCESS_INTERVAL,
  });
}

setupAndSubscribeToExternalEvents();

export function handleSocketError(
  connection: UserConnection,
  operation: string,
  error: unknown,
): void {
  const errorMessage = error instanceof Error ? error.message : 'Unknown error';
  log.error(`Error ${operation}: ${errorMessage}`);
  connection.sendNotice(`Failed to ${operation}`);
}

Deno.serve({
  port: 6942,
  handler: (request, connInfo) => {
    if (request.headers.get('upgrade') === 'websocket') {
      if (!isLocalhost(request, connInfo)) {
        logSecurityEvent({
          eventType: SecurityEventType.NON_LOCALHOST_CONNECTION_BLOCKED,
          severity: SecuritySeverity.HIGH,
          source: 'connection_security',
          details: {
            remote_addr: connInfo?.remoteAddr,
            user_agent: request.headers.get('user-agent'),
            host: request.headers.get('host'),
            origin: request.headers.get('origin'),
          },
          remoteAddr:
            connInfo?.remoteAddr?.transport === 'tcp'
              ? connInfo.remoteAddr.hostname
              : 'unknown',
        });

        return new Response(
          'Forbidden. Please read the Arx-CCN documentation for more information on how to interact with the relay.',
          { status: 403 },
        );
      }

      log.info('upgrading connection', { tag: 'WebSocket' });
      const { socket, response } = Deno.upgradeWebSocket(request);

      const connection = new UserConnection(socket, new Map(), db);

      socket.onopen = () => {
        log.info('User connected');

        logSecurityEvent({
          eventType: SecurityEventType.WEBSOCKET_CONNECTION_ESTABLISHED,
          severity: SecuritySeverity.LOW,
          source: 'connection_security',
          details: {
            remote_addr: connInfo?.remoteAddr,
            user_agent: request.headers.get('user-agent'),
            connection_time: new Date().toISOString(),
          },
          remoteAddr:
            connInfo?.remoteAddr?.transport === 'tcp'
              ? connInfo.remoteAddr.hostname
              : 'localhost',
        });
      };

      socket.onmessage = (event) => {
        log.debug('Received', {
          tag: 'WebSocket',
          data: event.data,
        });

        if (typeof event.data !== 'string' || !isValidJSON(event.data)) {
          log.warn('Invalid request', { tag: 'WebSocket' });

          logSecurityEvent({
            eventType: SecurityEventType.MALFORMED_EVENT,
            severity: SecuritySeverity.MEDIUM,
            source: 'websocket_handler',
            details: {
              data_type: typeof event.data,
              is_valid_json: isValidJSON(event.data),
              data_length: event.data?.length || 0,
            },
            remoteAddr:
              connInfo?.remoteAddr?.transport === 'tcp'
                ? connInfo.remoteAddr.hostname
                : 'localhost',
          });

          return;
        }

        const data = JSON.parse(event.data);
        if (!isArray(data)) {
          logSecurityEvent({
            eventType: SecurityEventType.MALFORMED_EVENT,
            severity: SecuritySeverity.MEDIUM,
            source: 'websocket_handler',
            details: {
              error: 'message_not_array',
              received_type: typeof data,
            },
            remoteAddr:
              connInfo?.remoteAddr?.transport === 'tcp'
                ? connInfo.remoteAddr.hostname
                : 'localhost',
          });

          return log.warn('Invalid request', { tag: 'WebSocket' });
        }

        const msgType = data[0];
        log.debug('received message', { tag: 'WebSocket', msgType });

        switch (msgType) {
          case 'REQ':
            return handleRequest(connection, n.clientREQ().parse(data));
          case 'EVENT':
            return handleEvent(connection, n.clientEVENT().parse(data)[1]);
          case 'CLOSE':
            return handleClose(connection, n.clientCLOSE().parse(data)[1]);
          case 'CCN':
            return handleCCNCommands(connection, data[1] as string, data[2]);
          default:
            logSecurityEvent({
              eventType: SecurityEventType.MALFORMED_EVENT,
              severity: SecuritySeverity.MEDIUM,
              source: 'websocket_handler',
              details: {
                error: 'unknown_message_type',
                message_type: msgType,
                data_preview: data.slice(0, 3), // First 3 elements for debugging
              },
              remoteAddr:
                connInfo?.remoteAddr?.transport === 'tcp'
                  ? connInfo.remoteAddr.hostname
                  : 'localhost',
            });

            log.warn('Invalid request', { tag: 'WebSocket' });
            return;
        }
      };

      socket.onclose = () => {
        log.info('User disconnected');

        logSecurityEvent({
          eventType: SecurityEventType.WEBSOCKET_CONNECTION_CLOSED,
          severity: SecuritySeverity.LOW,
          source: 'connection_security',
          details: {
            disconnect_time: new Date().toISOString(),
            subscriptions_count: connection.subscriptions.size,
          },
          remoteAddr:
            connInfo?.remoteAddr?.transport === 'tcp'
              ? connInfo.remoteAddr.hostname
              : 'localhost',
        });
      };

      return response;
    }
    return new Response(
      Deno.readTextFileSync(`${import.meta.dirname}/../public/landing.html`),
      {
        headers: { 'Content-Type': 'text/html' },
      },
    );
  },
});
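A sketch of a localhost client exercising the handler above; the subscription id and filter are illustrative.

const ws = new WebSocket('ws://127.0.0.1:6942');
ws.onopen = () => {
  // NIP-01 subscription: the 20 most recent kind-1 notes.
  ws.send(JSON.stringify(['REQ', 'sub1', { kinds: [1], limit: 20 }]));
};
ws.onmessage = (msg) => console.log(JSON.parse(msg.data));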
55 src/migrations.ts Normal file
@@ -0,0 +1,55 @@
import type { Database } from '@db/sqlite';
import { log } from './utils/logs.ts';
import { sql } from './utils/queries.ts';

export function runMigrations(db: Database, latestVersion: number) {
  const migrations = [
    ...Deno.readDirSync(`${import.meta.dirname}/../migrations`),
  ];
  migrations.sort((a, b) => {
    const aVersion = Number.parseInt(a.name.split('-')[0], 10);
    const bVersion = Number.parseInt(b.name.split('-')[0], 10);
    return aVersion - bVersion;
  });
  for (const migrationFile of migrations) {
    const migrationVersion = Number.parseInt(
      migrationFile.name.split('-')[0],
      10,
    );

    if (migrationVersion > latestVersion) {
      log.info(
        `Running migration ${migrationFile.name} (version ${migrationVersion})`,
      );
      const start = Date.now();
      const migrationSql = Deno.readTextFileSync(
        `${import.meta.dirname}/../migrations/${migrationFile.name}`,
      );
      db.run('BEGIN TRANSACTION');
      try {
        db.run(migrationSql);
        const end = Date.now();
        const durationMs = end - start;
        sql`
          INSERT INTO migration_history (migration_version, migration_name, executed_at, duration_ms, status) VALUES (${migrationVersion}, ${migrationFile.name}, ${new Date().toISOString()}, ${durationMs}, 'success');
        `(db);
      } catch (e) {
        db.run('ROLLBACK TRANSACTION');
        const error =
          e instanceof Error
            ? e
            : typeof e === 'string'
              ? new Error(e)
              : new Error(JSON.stringify(e));
        const end = Date.now();
        const durationMs = end - start;
        sql`
          INSERT INTO migration_history (migration_version, migration_name, executed_at, duration_ms, status, error_message) VALUES (${migrationVersion}, ${migrationFile.name}, ${new Date().toISOString()}, ${durationMs}, 'failed', ${error.message});
        `(db);
        throw e;
      }
      // Commit the migration and its history entry together. (A stray
      // db.run("COMMIT TRANSACTION") call had leaked into the SQL template
      // string above; it was removed, since END TRANSACTION here commits.)
      db.run('END TRANSACTION');
    }
  }
}
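Since the loop above sorts by the integer before the first '-', migration files are expected to follow a numeric-prefix naming scheme, e.g. (illustrative names, not from this diff):

migrations/
  0-initial-schema.sql
  1-add-logs-table.sql
  2-add-outbound-queue.sql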
39 src/relays.ts Normal file
@@ -0,0 +1,39 @@
import * as nostrTools from '@nostr/tools';
import { isArray } from './utils/isArray.ts';

export const pool = new nostrTools.SimplePool();
export const relays = [
  'wss://relay.arx-ccn.com/',
  'wss://relay.dannymorabito.com/',
  'wss://nos.lol/',
  'wss://nostr.einundzwanzig.space/',
  'wss://nostr.massmux.com/',
  'wss://nostr.mom/',
  'wss://nostr.wine/',
  'wss://purplerelay.com/',
  'wss://relay.damus.io/',
  'wss://relay.goodmorningbitcoin.com/',
  'wss://relay.lexingtonbitcoin.org/',
  'wss://relay.nostr.band/',
  'wss://relay.primal.net/',
  'wss://relay.snort.social/',
  'wss://strfry.iris.to/',
  'wss://cache2.primal.net/v1',
];

/**
 * FIXME: make sure to somehow tag encryptedEvents and add asserts, so that it's not possible to accidentally call this function with unencrypted events
 *
 * @param encryptedEvent the event to publish to the relay
 */
export async function publishToRelays(
  encryptedEvent: nostrTools.Event | nostrTools.Event[],
): Promise<void> {
  if (isArray(encryptedEvent)) {
    for (const chunk of encryptedEvent) {
      await Promise.any(pool.publish(relays, chunk));
    }
  } else {
    await Promise.any(pool.publish(relays, encryptedEvent));
  }
}
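Promise.any resolves on the first relay that accepts, so a publish succeeds as soon as one relay takes the event; when every relay rejects, it throws an AggregateError. A sketch of surfacing that case at a call site:

try {
  await publishToRelays(encryptedEvent);
} catch (e) {
  // Promise.any rejects with AggregateError when every relay refused the event.
  console.error('all relays rejected the event', e);
}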
8 src/utils/cleanupOldChunks.ts Normal file
@@ -0,0 +1,8 @@
import type { Database } from 'jsr:@db/sqlite';
import { CHUNK_MAX_AGE } from '../consts.ts';
import { sql } from './queries.ts';

export function cleanupOldChunks(db: Database) {
  const cutoffTime = Math.floor((Date.now() - CHUNK_MAX_AGE) / 1000);
  sql`DELETE FROM event_chunks WHERE created_at < ${cutoffTime}`(db);
}
52 src/utils/createNewCCN.ts Normal file
@@ -0,0 +1,52 @@
import { encodeBase64 } from 'jsr:@std/encoding@~0.224.1/base64';
import type { Database } from '@db/sqlite';
import * as nostrTools from '@nostr/tools';
import * as nip06 from '@nostr/tools/nip06';
import { encryptUint8Array, encryptionKey } from './encryption.ts';
import { getEveFilePath } from './files.ts';
import { sql } from './queries.ts';

/**
 * Create a new CCN and store it in the database
 *
 * @param db - The database instance
 * @param name - The name of the CCN
 * @param creator - The pubkey of the user creating the CCN
 * @param seed - The seed words for the CCN
 * @returns The public key and private key of the CCN
 */
export async function createNewCCN(
  db: Database,
  name: string,
  creator: string,
  seed?: string,
): Promise<{ pubkey: string; privkey: Uint8Array }> {
  const ccnSeed = seed || nip06.generateSeedWords();
  const ccnPrivateKey = nip06.privateKeyFromSeedWords(ccnSeed);
  const ccnPublicKey = nostrTools.getPublicKey(ccnPrivateKey);

  const ccnSeedPath = await getEveFilePath(`ccn_seeds/${ccnPublicKey}`);
  const ccnPrivPath = await getEveFilePath(`ccn_keys/${ccnPublicKey}`);

  await Deno.mkdir(await getEveFilePath('ccn_seeds'), { recursive: true });
  await Deno.mkdir(await getEveFilePath('ccn_keys'), { recursive: true });

  const encryptedPrivateKey = encryptUint8Array(ccnPrivateKey, encryptionKey);

  Deno.writeTextFileSync(ccnSeedPath, ccnSeed);
  Deno.writeTextFileSync(ccnPrivPath, encodeBase64(encryptedPrivateKey));

  db.run('BEGIN TRANSACTION');

  sql`INSERT INTO ccns (pubkey, name) VALUES (${ccnPublicKey}, ${name})`(db);
  sql`INSERT INTO allowed_writes (ccn_pubkey, pubkey) VALUES (${ccnPublicKey}, ${creator})`(
    db,
  );

  db.run('COMMIT TRANSACTION');

  return {
    pubkey: ccnPublicKey,
    privkey: ccnPrivateKey,
  };
}
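A usage sketch; the name and creator pubkey below are placeholders.

const { pubkey } = await createNewCCN(db, 'My Community', '<creator pubkey hex>');
// The encrypted key is now on disk under ccn_keys/<pubkey> and can be
// recovered later with getCCNPrivateKeyByPubkey(pubkey).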
217 src/utils/databaseLogger.ts Normal file
@@ -0,0 +1,217 @@
import type { Database, Statement } from '@db/sqlite';
import * as log from '@std/log';

interface DatabaseHandlerOptions extends log.BaseHandlerOptions {
  db?: Database;
}

export class DatabaseHandler extends log.BaseHandler {
  private db: Database | null = null;
  private insertStmt: Statement | null = null;

  constructor(levelName: log.LevelName, options: DatabaseHandlerOptions = {}) {
    super(levelName, options);
    if (options.db) {
      this.setDatabase(options.db);
    }
  }

  setDatabase(db: Database): void {
    this.db = db;
    try {
      this.insertStmt = this.db.prepare(`
        INSERT INTO logs (timestamp, level, message, args, source, event_type, severity, remote_addr, ccn_pubkey, event_id, risk_score)
        VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
      `);
    } catch (error) {
      console.warn('Database logger not ready:', error);
    }
  }

  override log(_msg: string): void {
    // Required by the abstract base class but not used in our implementation.
  }

  override handle(logRecord: log.LogRecord): void {
    if (this.shouldSkipLogging(logRecord)) {
      return;
    }

    if (!this.db || !this.insertStmt) {
      return;
    }

    try {
      const timestamp = new Date(logRecord.datetime).toISOString();
      const level = this.getLevelName(logRecord.level);
      const message = logRecord.msg;

      const securityData = this.extractSecurityData(logRecord.args);
      const sanitizedArgs = this.sanitizeArgs(logRecord.args);
      const argsJson =
        sanitizedArgs.length > 0 ? JSON.stringify(sanitizedArgs) : null;
      const source = this.extractSource(logRecord.args);

      this.insertStmt.run(
        timestamp,
        level,
        message,
        argsJson,
        source,
        securityData.eventType,
        securityData.severity,
        securityData.remoteAddr,
        securityData.ccnPubkey,
        securityData.eventId,
        securityData.riskScore,
      );
    } catch (error) {
      console.error('Failed to write log to database:', error);
    }
  }

  private getLevelName(level: number): string {
    switch (level) {
      case 10:
        return 'DEBUG';
      case 20:
        return 'INFO';
      case 30:
        return 'WARN';
      case 40:
        return 'ERROR';
      case 50:
        return 'FATAL';
      default:
        return `LVL${level}`;
    }
  }

  private shouldSkipLogging(logRecord: log.LogRecord): boolean {
    const message = logRecord.msg.toLowerCase();

    if (
      message.includes('sql') ||
      message.includes('database') ||
      message.includes('migration') ||
      message.includes('sqlite')
    ) {
      return true;
    }

    if (message.includes('log') && message.includes('database')) {
      return true;
    }

    return false;
  }

  private extractSecurityData(args: unknown[]): {
    eventType: string | null;
    severity: string | null;
    remoteAddr: string | null;
    ccnPubkey: string | null;
    eventId: string | null;
    riskScore: number | null;
  } {
    let eventType = null;
    let severity = null;
    let remoteAddr = null;
    let ccnPubkey = null;
    let eventId = null;
    let riskScore = null;

    for (const arg of args) {
      if (typeof arg === 'object' && arg !== null) {
        const obj = arg as Record<string, unknown>;

        if (obj.eventType && typeof obj.eventType === 'string') {
          eventType = obj.eventType;
        }
        if (obj.severity && typeof obj.severity === 'string') {
          severity = obj.severity;
        }
        if (obj.remoteAddr && typeof obj.remoteAddr === 'string') {
          remoteAddr = obj.remoteAddr;
        }
        if (obj.ccnPubkey && typeof obj.ccnPubkey === 'string') {
          ccnPubkey = obj.ccnPubkey;
        }
        if (obj.eventId && typeof obj.eventId === 'string') {
          eventId = obj.eventId;
        }
        if (obj.risk_score && typeof obj.risk_score === 'number') {
          riskScore = obj.risk_score;
        }
      }
    }

    return { eventType, severity, remoteAddr, ccnPubkey, eventId, riskScore };
  }

  private extractSource(args: unknown[]): string | null {
    for (const arg of args) {
      if (typeof arg === 'object' && arg !== null) {
        const obj = arg as Record<string, unknown>;
        if (obj.tag && typeof obj.tag === 'string') {
          return obj.tag;
        }
        if (obj.source && typeof obj.source === 'string') {
          return obj.source;
        }
      }
    }
    return null;
  }

  private sanitizeArgs(args: unknown[]): unknown[] {
    const sensitiveKeys = [
      'privatekey',
      'private_key',
      'privkey',
      'priv_key',
      'secretkey',
      'secret_key',
      'seckey',
      'sec_key',
      'password',
      'pass',
      'pwd',
      'token',
      'auth',
      'ccnprivatekey',
      'ccn_private_key',
      'ccnprivkey',
    ];

    return args.map((arg) => {
      if (typeof arg === 'object' && arg !== null) {
        const sanitized: Record<string, unknown> = {};

        for (const [key, value] of Object.entries(
          arg as Record<string, unknown>,
        )) {
          const lowerKey = key.toLowerCase();

          if (
            sensitiveKeys.some((sensitiveKey) =>
              lowerKey.includes(sensitiveKey),
            )
          ) {
            sanitized[key] = '[REDACTED]';
          } else if (value instanceof Uint8Array) {
            sanitized[key] = `[Uint8Array length=${value.length}]`;
          } else {
            sanitized[key] = value;
          }
        }

        return sanitized;
      }

      return arg;
    });
  }
}

export const dbHandler: DatabaseHandler | null = null;
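A sketch of wiring the handler into @std/log; the actual wiring lives in setupLogger (src/utils/logs.ts), which is not part of this hunk, so the setup below is an assumption about how it might look.

import * as log from '@std/log';

log.setup({
  handlers: {
    console: new log.ConsoleHandler('INFO'),
    database: new DatabaseHandler('DEBUG', { db }),
  },
  loggers: {
    default: { level: 'DEBUG', handlers: ['console', 'database'] },
  },
});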
src/utils/encryption.ts
@@ -1,7 +1,7 @@
-import { xchacha20poly1305 } from "@noble/ciphers/chacha";
-import { managedNonce } from "@noble/ciphers/webcrypto";
-import { decodeBase64 } from "jsr:@std/encoding/base64";
-export const encryptionKey = decodeBase64(Deno.env.get("ENCRYPTION_KEY") || "");
+import { decodeBase64 } from 'jsr:@std/encoding/base64';
+import { xchacha20poly1305 } from '@noble/ciphers/chacha';
+import { managedNonce } from '@noble/ciphers/webcrypto';
+export const encryptionKey = decodeBase64(Deno.env.get('ENCRYPTION_KEY') || '');
 
 /**
  * Encrypts a given Uint8Array using the XChaCha20-Poly1305 algorithm.
24 src/utils/eventTypes.ts Normal file
@@ -0,0 +1,24 @@
export function isReplaceableEvent(kind: number): boolean {
  return (kind >= 10000 && kind < 20000) || kind === 0 || kind === 3;
}

export function isAddressableEvent(kind: number): boolean {
  return kind >= 30000 && kind < 40000;
}

export function isRegularEvent(kind: number): boolean {
  return (
    (kind >= 1000 && kind < 10000) ||
    (kind >= 4 && kind < 45) ||
    kind === 1 ||
    kind === 2
  );
}

export function isDeleteEvent(kind: number): boolean {
  return kind === 5;
}

export function isCCNReplaceableEvent(kind: number): boolean {
  return kind >= 60000 && kind < 65536;
}
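Concrete kinds, for orientation (values per NIP-01 and this codebase's CCN range):

isReplaceableEvent(0); // true -- profile metadata
isRegularEvent(1); // true -- short text note
isDeleteEvent(5); // true -- NIP-09 deletion request
isAddressableEvent(30023); // true -- long-form article, keyed by its 'd' tag
isCCNReplaceableEvent(60000); // true -- CCN-wide replaceable range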
39 src/utils/files.ts Executable file
@@ -0,0 +1,39 @@
import { exists } from 'jsr:@std/fs';

/**
 * Returns the path to Eve's configuration directory, creating it if needed.
 *
 * On macOS, the directory is located at "$HOME/Library/Application Support/eve/arx/Eve".
 * On other systems, it defaults to "$XDG_CONFIG_HOME/arx/Eve" or
 * "$HOME/.config/arx/Eve" if XDG_CONFIG_HOME is not set.
 *
 * If the directory does not exist, it is created automatically.
 *
 * @returns A promise that resolves to the path of the configuration directory.
 */
export async function getEveConfigHome(): Promise<string> {
  let storagePath: string;
  if (Deno.build.os === 'darwin') {
    storagePath = `${Deno.env.get('HOME')}/Library/Application Support/eve/arx/Eve`;
  } else {
    const xdgConfigHome =
      Deno.env.get('XDG_CONFIG_HOME') ?? `${Deno.env.get('HOME')}/.config`;
    storagePath = `${xdgConfigHome}/arx/Eve`;
  }
  if (!(await exists(storagePath))) {
    await Deno.mkdir(storagePath, { recursive: true });
  }
  return storagePath;
}

/**
 * Returns the path to a file in Eve's configuration directory.
 *
 * @param file The name of the file to return the path for.
 * @returns The path to the file in Eve's configuration directory.
 */
export async function getEveFilePath(file: string): Promise<string> {
  const storagePath = await getEveConfigHome();
  return `${storagePath}/${file}`;
}
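Usage sketch:

// e.g. "$HOME/.config/arx/Eve/db" on Linux, or the Application Support
// equivalent on macOS.
const dbPath = await getEveFilePath('db');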
32 src/utils/filtersMatchingEvent.ts Normal file
@@ -0,0 +1,32 @@
import type { NostrEvent } from 'jsr:@nostrify/types';
import type { UserConnection } from '../UserConnection.ts';

export function filtersMatchingEvent(
  event: NostrEvent,
  connection: UserConnection,
): string[] {
  const matching = [];
  for (const subscription of connection.subscriptions.keys()) {
    const filters = connection.subscriptions.get(subscription);
    if (!filters) continue;
    const isMatching = filters.some((filter) =>
      Object.entries(filter).every(([type, value]) => {
        if (type === 'ids') return value.includes(event.id);
        if (type === 'kinds') return value.includes(event.kind);
        if (type === 'authors') return value.includes(event.pubkey);
        if (type === 'since') return event.created_at >= value; // NIP-01: 'since' is inclusive
        if (type === 'until') return event.created_at <= value;
        if (type === 'limit') return true;
        if (type.startsWith('#')) {
          const tagName = type.slice(1);
          return event.tags.some(
            (tag: string[]) => tag[0] === tagName && value.includes(tag[1]),
          );
        }
        return false;
      }),
    );
    if (isMatching) matching.push(subscription);
  }
  return matching;
}
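A sketch of the matching rule, assuming `connection` holds one subscription 'sub1' with filters [{ kinds: [1], '#t': ['invite'] }]; the placeholder event below matches on both the kind and the 't' tag.

const subs = filtersMatchingEvent(
  {
    id: '<event id>',
    kind: 1,
    pubkey: '<pubkey>',
    created_at: 1700000000,
    content: '',
    sig: '<sig>',
    tags: [['t', 'invite']],
  },
  connection,
);
// subs === ['sub1']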
17 src/utils/getActiveCCN.ts Normal file
@@ -0,0 +1,17 @@
import type { Database } from '@db/sqlite';
import { sql } from './queries.ts';

/**
 * Get the single active CCN from the database
 * @returns The active CCN or null if none is active
 */
export function getActiveCCN(
  db: Database,
): { pubkey: string; name: string } | null {
  const result = sql`SELECT pubkey, name FROM ccns WHERE is_active = 1 LIMIT 1`(
    db,
  );
  return result.length > 0
    ? (result[0] as { pubkey: string; name: string })
    : null;
}
13 src/utils/getAllCCNs.ts Normal file
@@ -0,0 +1,13 @@
import type { Database } from '@db/sqlite';
import { sql } from './queries.ts';

/**
 * Get all CCNs from the database
 */
export function getAllCCNs(db: Database): { pubkey: string; name: string }[] {
  return sql`SELECT pubkey, name FROM ccns`(db) as {
    pubkey: string;
    name: string;
  }[];
}
20 src/utils/getCCNPrivateKeyByPubkey.ts Normal file
@@ -0,0 +1,20 @@
import { decodeBase64 } from 'jsr:@std/encoding@~0.224.1/base64';
import { exists } from 'jsr:@std/fs/exists';
import { decryptUint8Array, encryptionKey } from './encryption.ts';
import { getEveFilePath } from './files.ts';

/**
 * Get the private key for a specific CCN
 */
export async function getCCNPrivateKeyByPubkey(
  pubkey: string,
): Promise<Uint8Array> {
  const ccnPrivPath = await getEveFilePath(`ccn_keys/${pubkey}`);

  if (await exists(ccnPrivPath)) {
    const encryptedPrivateKey = Deno.readTextFileSync(ccnPrivPath);
    return decryptUint8Array(decodeBase64(encryptedPrivateKey), encryptionKey);
  }

  throw new Error(`CCN private key for ${pubkey} not found`);
}
12 src/utils/getEncryptedEventByOriginalId.ts Normal file
@@ -0,0 +1,12 @@
import type { Database } from '@db/sqlite';
import type * as nostrTools from '@nostr/tools';
import { sql } from '../utils/queries.ts';

export function getEncryptedEventByOriginalId(
  db: Database,
  event: nostrTools.VerifiedEvent,
) {
  return sql`
    SELECT * FROM events WHERE original_id = ${event.id}
  `(db)[0];
}
76 src/utils/invites.ts Normal file
@@ -0,0 +1,76 @@
import type { Database } from 'jsr:@db/sqlite';
import { bytesToHex } from '@noble/ciphers/utils';
import { nip19 } from '@nostr/tools';
import { bech32m } from '@scure/base';
import { sql } from './queries.ts';

export class InviteTree {
  public root: InviteNode;

  constructor(npub: string) {
    this.root = new InviteNode(npub);
  }

  public addChild(npub: string) {
    const child = new InviteNode(npub);
    this.root.children.push(child);
  }
}

export class InviteNode {
  public readonly npub: string;
  public readonly children: InviteNode[];

  constructor(npub: string) {
    this.npub = npub;
    this.children = [];
  }

  public addChild(npub: string) {
    const child = new InviteNode(npub);
    this.children.push(child);
  }
}

export function buildInviteTree(db: Database, ccnPubkey: string) {
  const ccnCreator = sql`
    SELECT pubkey FROM allowed_writes WHERE ccn_pubkey = ${ccnPubkey} AND pubkey NOT IN (
      SELECT invitee_pubkey FROM inviter_invitee WHERE ccn_pubkey = ${ccnPubkey}
    )
  `(db)[0]?.pubkey;

  if (!ccnCreator) {
    throw new Error('CCN creator not found');
  }

  const inviteTree = new InviteTree(ccnCreator);

  const invitees = sql`
    SELECT inviter_pubkey, invitee_pubkey FROM inviter_invitee WHERE ccn_pubkey = ${ccnPubkey}
  `(db);

  // populate the invite tree by traversing the inviters
  for (const invitee of invitees) {
    let inviterNode = inviteTree.root.children.find(
      (child) => child.npub === invitee.inviter_pubkey,
    );

    if (!inviterNode) {
      inviterNode = new InviteNode(invitee.inviter_pubkey);
      inviteTree.root.children.push(inviterNode);
    }

    inviterNode.addChild(invitee.invitee_pubkey);
  }

  return inviteTree;
}

export function readInvite(invite: `${string}1${string}`) {
  const decoded = bech32m.decode(invite, false);
  if (decoded.prefix !== 'eveinvite') return false;
  const hexBytes = bech32m.fromWords(decoded.words);
  const npub = nip19.npubEncode(bytesToHex(hexBytes.slice(0, 32)));
  const inviteCode = bytesToHex(hexBytes.slice(32));
  return { npub, invite: inviteCode };
}
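A usage sketch; the invite string is a placeholder bech32m value with the 'eveinvite' prefix.

const decoded = readInvite('eveinvite1...' as `${string}1${string}`);
if (decoded) {
  // decoded.npub   -- the inviter's npub
  // decoded.invite -- the invite code as hex
}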
3 src/utils/isArray.ts Normal file
@@ -0,0 +1,3 @@
export function isArray<T>(obj: unknown): obj is T[] {
  return Array.isArray(obj);
}
39 src/utils/isLocalhost.ts Normal file
@@ -0,0 +1,39 @@
export function isLocalhost(
  req: Request,
  connInfo?: Deno.ServeHandlerInfo,
): boolean {
  if (connInfo?.remoteAddr) {
    const remoteAddr = connInfo.remoteAddr;
    if (remoteAddr.transport === 'tcp') {
      const hostname = remoteAddr.hostname;
      return hostname === '127.0.0.1' || hostname === '::1';
    }
    if (remoteAddr.transport === 'unix') {
      return true;
    }
  }
  const url = new URL(req.url);
  const hostname = url.hostname;
  if (hostname === '127.0.0.1' || hostname === '::1') {
    return true;
  }
  if (hostname === 'localhost') {
    const suspiciousHeaders = [
      'x-forwarded-for',
      'x-forwarded-host',
      'x-real-ip',
      'cf-connecting-ip',
      'x-cluster-client-ip',
    ];

    for (const header of suspiciousHeaders) {
      if (req.headers.get(header)) {
        return false;
      }
    }

    return true;
  }

  return false;
}
8 src/utils/isValidJSON.ts Normal file
@@ -0,0 +1,8 @@
export function isValidJSON(str: string) {
  try {
    JSON.parse(str);
  } catch {
    return false;
  }
  return true;
}
14 src/utils/knownEventsCache.ts Normal file
@@ -0,0 +1,14 @@
import { db } from '../index.ts';
import { sql } from './queries.ts';

let knownOriginalEventsCache: string[] = [];

export function isOriginalEventIdCached(eventId: string) {
  return knownOriginalEventsCache.includes(eventId);
}

export function updateKnownEventsCache() {
  knownOriginalEventsCache = sql`SELECT original_id FROM events`(db).flatMap(
    (row) => row.original_id,
  );
}
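One design note: isOriginalEventIdCached does a linear Array.prototype.includes scan on every call. If the events table grows large, a Set-backed variant keeps membership checks O(1). A hedged alternative sketch (not what this change does):

// Alternative sketch: same cache semantics, constant-time membership checks.
let knownOriginalEvents = new Set<string>();

function isCached(eventId: string): boolean {
  return knownOriginalEvents.has(eventId);
}

function refreshCache(rows: { original_id: string }[]) {
  knownOriginalEvents = new Set(rows.map((row) => row.original_id));
}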
162 src/utils/logQueries.ts Normal file
@@ -0,0 +1,162 @@
import type { Database } from '@db/sqlite';
import { sql } from './queries.ts';

export interface LogEntry {
  log_id: string;
  timestamp: string;
  level: string;
  message: string;
  args: string | null;
  source: string | null;
  created_at: number;
  event_type: string | null;
  severity: string | null;
  remote_addr: string | null;
  ccn_pubkey: string | null;
  event_id: string | null;
  risk_score: number | null;
}

export function getRecentLogs(
  db: Database,
  limit = 100,
  level?: string,
): LogEntry[] {
  if (level) {
    return sql`
      SELECT * FROM logs
      WHERE level = ${level}
      ORDER BY created_at DESC
      LIMIT ${limit}
    `(db) as LogEntry[];
  }

  return sql`
    SELECT * FROM logs
    ORDER BY created_at DESC
    LIMIT ${limit}
  `(db) as LogEntry[];
}

export function getSecurityLogs(
  db: Database,
  limit = 100,
  severity?: string,
): LogEntry[] {
  if (severity) {
    return sql`
      SELECT * FROM logs
      WHERE event_type IS NOT NULL AND severity = ${severity}
      ORDER BY created_at DESC
      LIMIT ${limit}
    `(db) as LogEntry[];
  }

  return sql`
    SELECT * FROM logs
    WHERE event_type IS NOT NULL
    ORDER BY created_at DESC
    LIMIT ${limit}
  `(db) as LogEntry[];
}

export function getLogsByTimeRange(
  db: Database,
  startTime: number,
  endTime: number,
  level?: string,
): LogEntry[] {
  if (level) {
    return sql`
      SELECT * FROM logs
      WHERE created_at >= ${startTime} AND created_at <= ${endTime} AND level = ${level}
      ORDER BY created_at DESC
    `(db) as LogEntry[];
  }

  return sql`
    SELECT * FROM logs
    WHERE created_at >= ${startTime} AND created_at <= ${endTime}
    ORDER BY created_at DESC
  `(db) as LogEntry[];
}

export function getLogsByCCN(
  db: Database,
  ccnPubkey: string,
  limit = 100,
): LogEntry[] {
  return sql`
    SELECT * FROM logs
    WHERE ccn_pubkey = ${ccnPubkey}
    ORDER BY created_at DESC
    LIMIT ${limit}
  `(db) as LogEntry[];
}

export function getHighRiskLogs(
  db: Database,
  minRiskScore = 7.0,
  limit = 50,
): LogEntry[] {
  return sql`
    SELECT * FROM logs
    WHERE risk_score >= ${minRiskScore}
    ORDER BY risk_score DESC, created_at DESC
    LIMIT ${limit}
  `(db) as LogEntry[];
}

export function getLogStats(db: Database): {
  total_logs: number;
  logs_by_level: Record<string, number>;
  security_events: number;
  high_risk_events: number;
  last_24h_logs: number;
} {
  const totalLogs = sql`SELECT COUNT(*) as count FROM logs`(db)[0].count;

  const logsByLevel = sql`
    SELECT level, COUNT(*) as count
    FROM logs
    GROUP BY level
  `(db);

  const securityEvents = sql`
    SELECT COUNT(*) as count
    FROM logs
    WHERE event_type IS NOT NULL
  `(db)[0].count;

  const highRiskEvents = sql`
    SELECT COUNT(*) as count
    FROM logs
    WHERE risk_score >= 7.0
  `(db)[0].count;

  const last24hLogs = sql`
    SELECT COUNT(*) as count
    FROM logs
    WHERE created_at >= ${Math.floor(Date.now() / 1000) - 86400}
  `(db)[0].count;

  const levelStats: Record<string, number> = {};
  for (const row of logsByLevel) {
    levelStats[row.level] = row.count;
  }

  return {
    total_logs: totalLogs,
    logs_by_level: levelStats,
    security_events: securityEvents,
    high_risk_events: highRiskEvents,
    last_24h_logs: last24hLogs,
  };
}

export function cleanupOldLogs(db: Database, daysToKeep = 30): number {
  const cutoffTime = Math.floor(Date.now() / 1000) - daysToKeep * 86400;

  const stmt = db.prepare('DELETE FROM logs WHERE created_at < ?');
  return stmt.run(cutoffTime);
}
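All of these helpers follow the same call shape; an illustrative session (database path invented):

import { Database } from '@db/sqlite';

const db = new Database('eve.db'); // illustrative path
const recentErrors = getRecentLogs(db, 20, 'ERROR');
const worstOffenders = getHighRiskLogs(db, 9.0, 10);
const deleted = cleanupOldLogs(db, 90); // keep 90 days; returns rows removed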
140 src/utils/logs.ts Normal file
@@ -0,0 +1,140 @@
import type { Database } from '@db/sqlite';
import * as colors from 'jsr:@std/fmt@^1.0.4/colors';
import * as log from 'jsr:@std/log';
import { DatabaseHandler } from './databaseLogger.ts';
import { getEveFilePath } from './files.ts';
export * as log from 'jsr:@std/log';

/**
 * Sanitizes data before logging to prevent accidental exposure of sensitive information
 * @param data The data to sanitize
 * @returns Sanitized data safe for logging
 */
function sanitizeForLogging(data: unknown): unknown {
  if (data === null || data === undefined || typeof data !== 'object') {
    return data;
  }

  if (data instanceof Uint8Array) {
    // Never log raw binary data that could contain keys
    return `[Uint8Array length=${data.length}]`;
  }

  if (Array.isArray(data)) {
    return data.map(sanitizeForLogging);
  }

  const sanitized: Record<string, unknown> = {};
  const sensitiveKeys = [
    'privatekey',
    'private_key',
    'privkey',
    'priv_key',
    'secretkey',
    'secret_key',
    'seckey',
    'sec_key',
    'password',
    'pass',
    'pwd',
    'token',
    'auth',
    'ccnprivatekey',
    'ccn_private_key',
    'ccnprivkey',
    'seed',
    'seedphrase',
    'seed_phrase',
    'mnemonic',
    'mnemonic_phrase',
    'mnemonic_phrase_words',
  ];

  for (const [key, value] of Object.entries(data as Record<string, unknown>)) {
    const lowerKey = key.toLowerCase();

    if (sensitiveKeys.some((sensitiveKey) => lowerKey.includes(sensitiveKey))) {
      sanitized[key] = '[REDACTED]';
    } else {
      sanitized[key] = sanitizeForLogging(value);
    }
  }

  return sanitized;
}

export async function setupLogger(db: Database | null) {
  const formatLevel = (level: number): string => {
    return (
      {
        10: colors.gray('[DEBUG]'),
        20: colors.green('[INFO] '),
        30: colors.yellow('[WARN] '),
        40: colors.red('[ERROR]'),
        50: colors.bgRed('[FATAL]'),
      }[level] || `[LVL${level}]`
    );
  };

  const levelName = (level: number): string => {
    return (
      {
        10: 'DEBUG',
        20: 'INFO',
        30: 'WARN',
        40: 'ERROR',
        50: 'FATAL',
      }[level] || `LVL${level}`
    );
  };

  const formatArg = (arg: unknown): string => {
    const sanitized = sanitizeForLogging(arg);
    if (typeof sanitized === 'object') return JSON.stringify(sanitized);
    return String(sanitized);
  };

  const handlers: Record<string, log.BaseHandler> = {
    console: new log.ConsoleHandler('DEBUG', {
      useColors: true,
      formatter: (record) => {
        const timestamp = new Date().toISOString();
        let msg = `${colors.dim(`[${timestamp}]`)} ${formatLevel(record.level)} ${record.msg}`;

        if (record.args.length > 0) {
          const args = record.args
            .map((arg, i) => `${colors.dim(`arg${i}:`)} ${formatArg(arg)}`)
            .join(' ');
          msg += ` ${colors.dim('|')} ${args}`;
        }

        return msg;
      },
    }),
    file: new log.FileHandler('DEBUG', {
      filename:
        Deno.env.get('LOG_FILE') || (await getEveFilePath('eve-logs.jsonl')),
      formatter: (record) => {
        const timestamp = new Date().toISOString();
        return JSON.stringify({
          timestamp,
          level: levelName(record.level),
          msg: record.msg,
          args: record.args.map(sanitizeForLogging),
        });
      },
    }),
  };
  if (db) {
    handlers.database = new DatabaseHandler('DEBUG', { db });
  }
  log.setup({
    handlers,
    loggers: {
      default: {
        level: 'DEBUG',
        handlers: ['console', 'file', 'database'],
      },
    },
  });
}
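Concretely, the sanitizer means any logged object property whose lower-cased key contains one of those substrings never reaches a handler in the clear. An illustrative call (key names and values invented):

// After setupLogger(db) has run, a call like this:
log.info('ccn created', { name: 'my ccn', ccnPrivateKey: 'a1b2c3' });
// is written out with the sensitive key redacted, roughly:
// [INFO ] ccn created | arg0: {"name":"my ccn","ccnPrivateKey":"[REDACTED]"}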
40 src/utils/option.ts Normal file
@@ -0,0 +1,40 @@
export type Option<T> =
  | {
      value: T;
      isSome: true;
    }
  | {
      value: undefined;
      isSome: false;
    };

export function Some<T>(value: T): Option<T> {
  return { value, isSome: true };
}

export function None<T>(): Option<T> {
  return { value: undefined, isSome: false };
}

export function map<T, U>(option: Option<T>, fn: (value: T) => U): Option<U> {
  return option.isSome ? Some(fn(option.value)) : None();
}

export function flatMap<T, U>(
  option: Option<T>,
  fn: (value: T) => Option<U>,
): Option<U> {
  return option.isSome ? fn(option.value) : None();
}

export function getOrElse<T>(option: Option<T>, defaultValue: T): T {
  return option.isSome ? option.value : defaultValue;
}

export function fold<T, U>(
  option: Option<T>,
  onNone: () => U,
  onSome: (value: T) => U,
): U {
  return option.isSome ? onSome(option.value) : onNone();
}
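A small usage sketch (environment variable and default invented) showing how the helpers compose:

// Parse an optional numeric env var into an Option and fall back to a default.
const raw = Deno.env.get('EVE_PORT');
const port = getOrElse(
  flatMap(raw ? Some(raw) : None<string>(), (s) => {
    const n = Number.parseInt(s, 10);
    return Number.isNaN(n) ? None<number>() : Some(n);
  }),
  6942, // invented default
);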
389 src/utils/outboundQueue.ts Normal file
@@ -0,0 +1,389 @@
import type * as nostrTools from '@nostr/tools';
import type { Database } from 'jsr:@db/sqlite';
import { MAX_TRANSMISSION_ATTEMPTS } from '../consts.ts';
import { publishToRelays, relays } from '../relays.ts';
import { log } from './logs.ts';
import { sql } from './queries.ts';
import {
  SecurityEventType,
  SecuritySeverity,
  logSecurityEvent,
} from './securityLogs.ts';

export interface QueuedEvent {
  queue_id: number;
  event_id: string;
  encrypted_event: string;
  ccn_pubkey: string;
  created_at: number;
  attempts: number;
  last_attempt: number | null;
  status: 'pending' | 'sending' | 'sent' | 'failed' | 'stale';
  error_message: string | null;
}

let isConnectedToRelays = false;
let lastSuccessfulTransmission = 0;
let consecutiveFailures = 0;

export function queueEventForTransmission(
  db: Database,
  eventId: string,
  encryptedEvent: nostrTools.VerifiedEvent | nostrTools.VerifiedEvent[],
  ccnPubkey: string,
): void {
  try {
    const encryptedEventJson = JSON.stringify(encryptedEvent);

    sql`
      INSERT OR REPLACE INTO outbound_event_queue
      (event_id, encrypted_event, ccn_pubkey, status)
      VALUES (${eventId}, ${encryptedEventJson}, ${ccnPubkey}, 'pending')
    `(db);

    log.debug('Event queued for transmission', {
      tag: 'outboundQueue',
      eventId,
      ccnPubkey,
      connectionState: isConnectedToRelays ? 'connected' : 'offline',
    });

    logSecurityEvent({
      eventType: SecurityEventType.EVENT_QUEUED_FOR_TRANSMISSION,
      severity: SecuritySeverity.LOW,
      source: 'outbound_queue',
      details: {
        action: 'event_queued',
        event_id: eventId,
        ccn_pubkey: ccnPubkey,
        is_chunked: Array.isArray(encryptedEvent),
        connection_state: isConnectedToRelays,
      },
    });
  } catch (error) {
    log.error('Failed to queue event for transmission', {
      tag: 'outboundQueue',
      eventId,
      error,
    });
    throw error;
  }
}

export function getPendingEvents(db: Database, limit = 50): QueuedEvent[] {
  const now = Math.floor(Date.now() / 1000);
  const maxAge = 30 * 24 * 60 * 60;

  sql`
    UPDATE outbound_event_queue
    SET status = 'stale'
    WHERE status IN ('pending', 'failed')
      AND created_at < ${now - maxAge}
  `(db);

  return sql`
    SELECT * FROM outbound_event_queue
    WHERE (
      status = 'pending'
      OR (
        status = 'failed'
        AND attempts < ${MAX_TRANSMISSION_ATTEMPTS}
        AND (
          last_attempt IS NULL OR
          last_attempt < ${now - getAdaptiveRetryDelay(consecutiveFailures)}
        )
      )
    )
    AND status != 'stale'
    ORDER BY
      CASE WHEN status = 'pending' THEN 0 ELSE 1 END,
      created_at ASC
    LIMIT ${limit}
  `(db) as QueuedEvent[];
}

function getAdaptiveRetryDelay(failures: number): number {
  return Math.min(300, 2 ** failures * 10);
}

async function checkRelayConnectivity(): Promise<boolean> {
  const connectivityTests = relays.map(async (relay) => {
    try {
      const ws = new WebSocket(relay);

      return new Promise<boolean>((resolve) => {
        const timeout = setTimeout(() => {
          ws.close();
          resolve(false);
        }, 3000); // 3 second timeout

        ws.onopen = () => {
          clearTimeout(timeout);
          ws.close();
          resolve(true);
        };

        ws.onerror = () => {
          clearTimeout(timeout);
          resolve(false);
        };

        ws.onclose = () => {
          clearTimeout(timeout);
          resolve(false);
        };
      });
    } catch {
      return false;
    }
  });

  try {
    const results = await Promise.allSettled(connectivityTests);
    const successfulConnections = results.filter(
      (result) => result.status === 'fulfilled' && result.value === true,
    ).length;

    const isConnected = successfulConnections > 0;

    log.debug('Relay connectivity check completed', {
      tag: 'outboundQueue',
      successfulConnections,
      totalTested: relays.length,
      isConnected,
    });

    return isConnected;
  } catch {
    return false;
  }
}

export async function processStartupQueue(db: Database): Promise<void> {
  const startupEvents = sql`
    SELECT COUNT(*) as count FROM outbound_event_queue
    WHERE status IN ('pending', 'failed', 'sending')
  `(db)[0].count;

  if (startupEvents > 0) {
    log.info(`Found ${startupEvents} events from previous session`, {
      tag: 'outboundQueue',
    });

    sql`
      UPDATE outbound_event_queue
      SET status = 'failed',
          attempts = attempts + 1,
          error_message = 'Interrupted by shutdown'
      WHERE status = 'sending'
    `(db);

    await processOutboundQueue(db);
  }
}

export function markEventSending(db: Database, queueId: number): void {
  const now = Math.floor(Date.now() / 1000);

  sql`
    UPDATE outbound_event_queue
    SET status = 'sending', last_attempt = ${now}
    WHERE queue_id = ${queueId}
  `(db);
}

export function markEventSent(db: Database, queueId: number): void {
  sql`
    UPDATE outbound_event_queue
    SET status = 'sent'
    WHERE queue_id = ${queueId}
  `(db);

  isConnectedToRelays = true;
  lastSuccessfulTransmission = Math.floor(Date.now() / 1000);
  consecutiveFailures = 0;

  log.debug('Event marked as sent', {
    tag: 'outboundQueue',
    queueId,
  });
}

export function markEventFailed(
  db: Database,
  queueId: number,
  errorMessage: string,
): void {
  sql`
    UPDATE outbound_event_queue
    SET status = 'failed',
        attempts = attempts + 1,
        error_message = ${errorMessage}
    WHERE queue_id = ${queueId}
  `(db);

  consecutiveFailures++;
  const timeSinceLastSuccess =
    Math.floor(Date.now() / 1000) - lastSuccessfulTransmission;

  if (consecutiveFailures >= 3 || timeSinceLastSuccess > 300) {
    isConnectedToRelays = false;
  }

  log.warn('Event transmission failed', {
    tag: 'outboundQueue',
    queueId,
    errorMessage,
    consecutiveFailures,
    timeSinceLastSuccess,
    connectionState: isConnectedToRelays ? 'connected' : 'offline',
  });

  logSecurityEvent({
    eventType: SecurityEventType.SYSTEM_STARTUP,
    severity: SecuritySeverity.MEDIUM,
    source: 'outbound_queue',
    details: {
      action: 'transmission_failed',
      queue_id: queueId,
      error_message: errorMessage,
      consecutive_failures: consecutiveFailures,
    },
  });
}

async function batchProcessEvents(
  db: Database,
  events: QueuedEvent[],
): Promise<void> {
  const BATCH_SIZE = 10;
  const batches = [];

  for (let i = 0; i < events.length; i += BATCH_SIZE) {
    batches.push(events.slice(i, i + BATCH_SIZE));
  }

  for (const batch of batches) {
    const promises = batch.map(async (queuedEvent) => {
      try {
        const encryptedEvent = JSON.parse(queuedEvent.encrypted_event);
        await publishToRelays(encryptedEvent);
        return { queueId: queuedEvent.queue_id, success: true };
      } catch (error) {
        const errorMessage =
          error instanceof Error ? error.message : 'Unknown error';
        return {
          queueId: queuedEvent.queue_id,
          success: false,
          error: errorMessage,
        };
      }
    });

    const results = await Promise.allSettled(promises);

    results.forEach((result, index) => {
      const queuedEvent = batch[index];
      if (result.status === 'fulfilled' && result.value.success) {
        markEventSent(db, queuedEvent.queue_id);
      } else {
        const error =
          result.status === 'fulfilled'
            ? result.value.error
            : 'Promise rejected';
        markEventFailed(db, queuedEvent.queue_id, error || 'Unknown error');
      }
    });

    if (batches.indexOf(batch) < batches.length - 1) {
      await new Promise((resolve) => setTimeout(resolve, 1000));
    }
  }
}

export async function processOutboundQueue(db: Database): Promise<void> {
  const pendingEvents = getPendingEvents(db);

  if (pendingEvents.length === 0) {
    return;
  }

  const connectivityResult = await checkRelayConnectivity();
  if (!connectivityResult && consecutiveFailures > 5) {
    log.debug('Skipping queue processing - appears to be offline', {
      tag: 'outboundQueue',
      consecutiveFailures,
    });
    return;
  }

  log.info(`Processing ${pendingEvents.length} pending events`, {
    tag: 'outboundQueue',
    connectionState: isConnectedToRelays ? 'connected' : 'unknown',
  });

  for (const event of pendingEvents) {
    markEventSending(db, event.queue_id);
  }

  await batchProcessEvents(db, pendingEvents);
}

export function getQueueStats(db: Database): {
  pending: number;
  sending: number;
  sent: number;
  failed: number;
  stale: number;
  total: number;
  connectionState: string;
  consecutiveFailures: number;
} {
  const stats = sql`
    SELECT
      status,
      COUNT(*) as count
    FROM outbound_event_queue
    GROUP BY status
  `(db);

  const result = {
    pending: 0,
    sending: 0,
    sent: 0,
    failed: 0,
    stale: 0,
    total: 0,
    connectionState: isConnectedToRelays ? 'connected' : 'offline',
    consecutiveFailures,
  };

  for (const stat of stats) {
    const status = stat.status as
      | 'pending'
      | 'sending'
      | 'sent'
      | 'failed'
      | 'stale';
    switch (status) {
      case 'pending':
        result.pending = stat.count;
        break;
      case 'sending':
        result.sending = stat.count;
        break;
      case 'sent':
        result.sent = stat.count;
        break;
      case 'failed':
        result.failed = stat.count;
        break;
      case 'stale':
        result.stale = stat.count;
        break;
    }
    result.total += stat.count;
  }

  return result;
}
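For reference, the retry schedule getAdaptiveRetryDelay produces is exponential with a five-minute cap:

// consecutiveFailures: 0    1    2    3    4    5+
// delay in seconds:    10   20   40   80   160  300   (Math.min(300, 2 ** n * 10))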
14 src/utils/parseATagQuery.ts Normal file
@@ -0,0 +1,14 @@
export function parseATagQuery(aTagValue: string): {
  kind: number;
  pubkey: string;
  dTag?: string;
} {
  const parts = aTagValue.split(':');
  if (parts.length < 2) return { kind: 0, pubkey: '' };

  return {
    kind: Number.parseInt(parts[0], 10),
    pubkey: parts[1],
    dTag: parts.length > 2 ? parts[2] : undefined,
  };
}
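For example (values invented):

parseATagQuery('30023:abc123:my-article');
// => { kind: 30023, pubkey: 'abc123', dTag: 'my-article' }
parseATagQuery('0:abc123');
// => { kind: 0, pubkey: 'abc123', dTag: undefined }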
src/utils/queries.ts
@@ -1,4 +1,4 @@
-import type { BindValue, Database } from "@db/sqlite";
+import type { BindValue, Database } from '@db/sqlite';
 
 /**
  * Construct a SQL query with placeholders for values.
@@ -23,8 +23,8 @@ export function sqlPartial(
 ) {
   return {
     query: segments.reduce(
-      (acc, str, i) => acc + str + (i < values.length ? "?" : ""),
-      "",
+      (acc, str, i) => acc + str + (i < values.length ? '?' : ''),
+      '',
     ),
     values: values,
   };
@@ -72,7 +72,7 @@ export function mixQuery(...queries: { query: string; values: BindValue[] }[]) {
       query: `${acc.query} ${query}`,
       values: [...acc.values, ...values],
     }),
-    { query: "", values: [] },
+    { query: '', values: [] },
   );
   return { query, values };
 }
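For context, as used throughout this change the sql tagged template builds a parameterized query and returns a runner applied to the Database, so interpolations become bound placeholders rather than string concatenation:

// Illustrative call: 'ERROR' and 10 are bound as `?` parameters.
const rows = sql`SELECT * FROM logs WHERE level = ${'ERROR'} LIMIT ${10}`(db);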
7 src/utils/randomTimeUpTo2DaysInThePast.ts Normal file
@@ -0,0 +1,7 @@
export function randomTimeUpTo2DaysInThePast() {
  const now = Date.now();
  const twoDaysAgo = now - 2 * 24 * 60 * 60 * 1000 - 3600 * 1000; // 1 hour buffer in case of clock skew
  return Math.floor(
    (Math.floor(Math.random() * (now - twoDaysAgo)) + twoDaysAgo) / 1000,
  );
}
220 src/utils/securityLogs.ts Normal file
@@ -0,0 +1,220 @@
import { log } from './logs.ts';

export enum SecurityEventType {
  // Authentication & Authorization
  CCN_ACCESS_DENIED = 'ccn_access_denied',
  CCN_ACTIVATION_ATTEMPT = 'ccn_activation_attempt',
  CCN_CREATION_ATTEMPT = 'ccn_creation_attempt',
  UNAUTHORIZED_WRITE_ATTEMPT = 'unauthorized_write_attempt',

  // Connection Security
  NON_LOCALHOST_CONNECTION_BLOCKED = 'non_localhost_connection_blocked',
  SUSPICIOUS_HEADER_DETECTED = 'suspicious_header_detected',
  WEBSOCKET_CONNECTION_ESTABLISHED = 'websocket_connection_established',
  WEBSOCKET_CONNECTION_CLOSED = 'websocket_connection_closed',

  // Cryptographic Operations
  DECRYPTION_FAILURE = 'decryption_failure',
  INVALID_SIGNATURE = 'invalid_signature',
  POW_VALIDATION_FAILURE = 'pow_validation_failure',
  ENCRYPTION_ERROR = 'encryption_error',

  // Event Processing
  DUPLICATE_EVENT_BLOCKED = 'duplicate_event_blocked',
  MALFORMED_EVENT = 'malformed_event',
  CHUNKED_EVENT_RECEIVED = 'chunked_event_received',
  CHUNKED_EVENT_COMPLETED = 'chunked_event_completed',
  EVENT_QUEUED_FOR_TRANSMISSION = 'event_queued_for_transmission',

  // Resource Usage & DoS Protection
  SUBSCRIPTION_LIMIT_EXCEEDED = 'subscription_limit_exceeded',
  MEMORY_USAGE_HIGH = 'memory_usage_high',
  LARGE_PAYLOAD_DETECTED = 'large_payload_detected',

  // Database Security
  SQL_QUERY_EXECUTED = 'sql_query_executed',
  MIGRATION_EXECUTED = 'migration_executed',
  TRANSACTION_ROLLBACK = 'transaction_rollback',

  // CCN Boundary Violations
  CCN_BOUNDARY_VIOLATION_ATTEMPT = 'ccn_boundary_violation_attempt',
  INVITE_VALIDATION_FAILURE = 'invite_validation_failure',
  INVITE_ALREADY_USED = 'invite_already_used',

  // System Events
  SYSTEM_STARTUP = 'system_startup',
  SYSTEM_SHUTDOWN = 'system_shutdown',
  CONFIGURATION_LOADED = 'configuration_loaded',
  ERROR_THRESHOLD_EXCEEDED = 'error_threshold_exceeded',
}

export enum SecuritySeverity {
  LOW = 'low',
  MEDIUM = 'medium',
  HIGH = 'high',
  CRITICAL = 'critical',
}

export interface SecurityEventData {
  eventType: SecurityEventType;
  severity: SecuritySeverity;
  timestamp: string;
  source: string;
  details: Record<string, unknown>;
  userAgent?: string;
  remoteAddr?: string;
  ccnPubkey?: string;
  userId?: string;
  eventId?: string;
  subscriptionId?: string;
  risk_score?: number;
}

class SecurityLogger {
  private readonly eventCounts = new Map<SecurityEventType, number>();
  private readonly lastEventTime = new Map<SecurityEventType, number>();

  logSecurityEvent(data: Omit<SecurityEventData, 'timestamp'>): void {
    const eventData: SecurityEventData = {
      ...data,
      timestamp: new Date().toISOString(),
    };

    this.updateEventTracking(data.eventType);

    switch (data.severity) {
      case SecuritySeverity.CRITICAL:
        log.error(`SECURITY_CRITICAL: ${data.eventType}`, eventData);
        break;
      case SecuritySeverity.HIGH:
        log.error(`SECURITY_HIGH: ${data.eventType}`, eventData);
        break;
      case SecuritySeverity.MEDIUM:
        log.warn(`SECURITY_MEDIUM: ${data.eventType}`, eventData);
        break;
      case SecuritySeverity.LOW:
        log.info(`SECURITY_LOW: ${data.eventType}`, eventData);
        break;
    }
  }

  logAuthEvent(
    eventType: SecurityEventType,
    success: boolean,
    details: Record<string, unknown>,
    remoteAddr?: string,
  ): void {
    this.logSecurityEvent({
      eventType,
      severity: success ? SecuritySeverity.LOW : SecuritySeverity.MEDIUM,
      source: 'authentication',
      details: { success, ...details },
      remoteAddr,
    });
  }

  logCCNViolation(
    eventType: SecurityEventType,
    ccnPubkey: string,
    attemptedAction: string,
    details: Record<string, unknown>,
  ): void {
    this.logSecurityEvent({
      eventType,
      severity: SecuritySeverity.HIGH,
      source: 'ccn_boundary',
      ccnPubkey,
      details: { attemptedAction, ...details },
      risk_score: 8.5,
    });
  }

  logCryptoFailure(
    eventType: SecurityEventType,
    operation: string,
    details: Record<string, unknown>,
  ): void {
    this.logSecurityEvent({
      eventType,
      severity: SecuritySeverity.MEDIUM,
      source: 'cryptography',
      details: { operation, ...details },
    });
  }

  logDoSEvent(
    eventType: SecurityEventType,
    details: Record<string, unknown>,
    remoteAddr?: string,
  ): void {
    this.logSecurityEvent({
      eventType,
      severity: SecuritySeverity.HIGH,
      source: 'dos_protection',
      details,
      remoteAddr,
      risk_score: 7.0,
    });
  }

  logSystemEvent(
    eventType: SecurityEventType,
    details: Record<string, unknown>,
  ): void {
    this.logSecurityEvent({
      eventType,
      severity: SecuritySeverity.LOW,
      source: 'system',
      details,
    });
  }

  private updateEventTracking(eventType: SecurityEventType): void {
    const now = Date.now();
    const count = this.eventCounts.get(eventType) || 0;
    this.eventCounts.set(eventType, count + 1);
    this.lastEventTime.set(eventType, now);
  }
}

export const securityLogger = new SecurityLogger();

export const logSecurityEvent = (data: Omit<SecurityEventData, 'timestamp'>) =>
  securityLogger.logSecurityEvent(data);

export const logAuthEvent = (
  eventType: SecurityEventType,
  success: boolean,
  details: Record<string, unknown>,
  remoteAddr?: string,
) => securityLogger.logAuthEvent(eventType, success, details, remoteAddr);

export const logCCNViolation = (
  eventType: SecurityEventType,
  ccnPubkey: string,
  attemptedAction: string,
  details: Record<string, unknown>,
) =>
  securityLogger.logCCNViolation(
    eventType,
    ccnPubkey,
    attemptedAction,
    details,
  );

export const logCryptoFailure = (
  eventType: SecurityEventType,
  operation: string,
  details: Record<string, unknown>,
) => securityLogger.logCryptoFailure(eventType, operation, details);

export const logDoSEvent = (
  eventType: SecurityEventType,
  details: Record<string, unknown>,
  remoteAddr?: string,
) => securityLogger.logDoSEvent(eventType, details, remoteAddr);

export const logSystemEvent = (
  eventType: SecurityEventType,
  details: Record<string, unknown>,
) => securityLogger.logSystemEvent(eventType, details);
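The convenience wrappers keep call sites short; an illustrative flood report (payload and address invented) would be:

logDoSEvent(
  SecurityEventType.SUBSCRIPTION_LIMIT_EXCEEDED,
  { subscription_count: 51, limit: 50 }, // illustrative payload
  '203.0.113.7', // example client address
);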
147 utils.ts
@@ -1,19 +1,21 @@
-import { exists } from "jsr:@std/fs";
-import * as nostrTools from "@nostr/tools";
-import * as nip06 from "@nostr/tools/nip06";
-import { decodeBase64, encodeBase64 } from "jsr:@std/encoding@0.224/base64";
-import { getEveFilePath } from "./utils/files.ts";
+import * as nostrTools from '@nostr/tools';
+import * as nip06 from '@nostr/tools/nip06';
+import type { Database } from 'jsr:@db/sqlite';
+import { decodeBase64, encodeBase64 } from 'jsr:@std/encoding@0.224/base64';
+import { exists } from 'jsr:@std/fs';
 import {
   decryptUint8Array,
-  encryptionKey,
   encryptUint8Array,
-} from "./utils/encryption.ts";
+  encryptionKey,
+} from './utils/encryption.ts';
+import { getEveFilePath } from './utils/files.ts';
+import { sql } from './utils/queries.ts';
 
 export function isLocalhost(req: Request): boolean {
   const url = new URL(req.url);
   const hostname = url.hostname;
   return (
-    hostname === "127.0.0.1" || hostname === "::1" || hostname === "localhost"
+    hostname === '127.0.0.1' || hostname === '::1' || hostname === 'localhost'
   );
 }
@@ -38,28 +40,123 @@ export function randomTimeUpTo2DaysInThePast() {
   );
 }
 
-export async function getCCNPubkey(): Promise<string> {
-  const ccnPubPath = await getEveFilePath("ccn.pub");
-  const doWeHaveKey = await exists(ccnPubPath);
-  if (doWeHaveKey) return Deno.readTextFileSync(ccnPubPath);
-  const ccnSeed = Deno.env.get("CCN_SEED") || nip06.generateSeedWords();
+/**
+ * Get all CCNs from the database
+ */
+export function getAllCCNs(db: Database): { pubkey: string; name: string }[] {
+  return sql`SELECT pubkey, name FROM ccns`(db) as {
+    pubkey: string;
+    name: string;
+  }[];
+}
+
+/**
+ * Create a new CCN and store it in the database
+ *
+ * @param db - The database instance
+ * @param name - The name of the CCN
+ * @param seed - The seed words for the CCN
+ * @returns The public key and private key of the CCN
+ */
+export async function createNewCCN(
+  db: Database,
+  name: string,
+  creator: string,
+  seed?: string,
+): Promise<{ pubkey: string; privkey: Uint8Array }> {
+  const ccnSeed = seed || nip06.generateSeedWords();
   const ccnPrivateKey = nip06.privateKeyFromSeedWords(ccnSeed);
   const ccnPublicKey = nostrTools.getPublicKey(ccnPrivateKey);
 
+  const ccnSeedPath = await getEveFilePath(`ccn_seeds/${ccnPublicKey}`);
+  const ccnPrivPath = await getEveFilePath(`ccn_keys/${ccnPublicKey}`);
+
+  await Deno.mkdir(await getEveFilePath('ccn_seeds'), { recursive: true });
+  await Deno.mkdir(await getEveFilePath('ccn_keys'), { recursive: true });
+
   const encryptedPrivateKey = encryptUint8Array(ccnPrivateKey, encryptionKey);
 
-  Deno.writeTextFileSync(ccnPubPath, ccnPublicKey);
-  Deno.writeTextFileSync(
-    await getEveFilePath("ccn.priv"),
-    encodeBase64(encryptedPrivateKey),
-  );
-  Deno.writeTextFileSync(await getEveFilePath("ccn.seed"), ccnSeed);
+  Deno.writeTextFileSync(ccnSeedPath, ccnSeed);
+  Deno.writeTextFileSync(ccnPrivPath, encodeBase64(encryptedPrivateKey));
 
-  return ccnPublicKey;
+  db.run('BEGIN TRANSACTION');
+
+  sql`INSERT INTO ccns (pubkey, name) VALUES (${ccnPublicKey}, ${name})`(db);
+  sql`INSERT INTO allowed_writes (ccn_pubkey, pubkey) VALUES (${ccnPublicKey}, ${creator})`(
+    db,
+  );
+
+  db.run('COMMIT TRANSACTION');
+
+  return {
+    pubkey: ccnPublicKey,
+    privkey: ccnPrivateKey,
+  };
 }
 
-export async function getCCNPrivateKey(): Promise<Uint8Array> {
-  const encryptedPrivateKey = Deno.readTextFileSync(
-    await getEveFilePath("ccn.priv"),
-  );
-  return decryptUint8Array(decodeBase64(encryptedPrivateKey), encryptionKey);
+/**
+ * Get the private key for a specific CCN
+ */
+export async function getCCNPrivateKeyByPubkey(
+  pubkey: string,
+): Promise<Uint8Array> {
+  const ccnPrivPath = await getEveFilePath(`ccn_keys/${pubkey}`);
+
+  if (await exists(ccnPrivPath)) {
+    const encryptedPrivateKey = Deno.readTextFileSync(ccnPrivPath);
+    return decryptUint8Array(decodeBase64(encryptedPrivateKey), encryptionKey);
+  }
+
+  throw new Error(`CCN private key for ${pubkey} not found`);
+}
+
+export function isReplaceableEvent(kind: number): boolean {
+  return (kind >= 10000 && kind < 20000) || kind === 0 || kind === 3;
+}
+
+export function isAddressableEvent(kind: number): boolean {
+  return kind >= 30000 && kind < 40000;
+}
+
+export function isRegularEvent(kind: number): boolean {
+  return (
+    (kind >= 1000 && kind < 10000) ||
+    (kind >= 4 && kind < 45) ||
+    kind === 1 ||
+    kind === 2
+  );
+}
+
+export function isCCNReplaceableEvent(kind: number): boolean {
+  return kind >= 60000 && kind < 65536;
+}
+
+export function parseATagQuery(aTagValue: string): {
+  kind: number;
+  pubkey: string;
+  dTag?: string;
+} {
+  const parts = aTagValue.split(':');
+  if (parts.length < 2) return { kind: 0, pubkey: '' };
+
+  return {
+    kind: Number.parseInt(parts[0], 10),
+    pubkey: parts[1],
+    dTag: parts.length > 2 ? parts[2] : undefined,
+  };
+}
+
+/**
+ * Get the single active CCN from the database
+ * @returns The active CCN or null if none is active
+ */
+export function getActiveCCN(
+  db: Database,
+): { pubkey: string; name: string } | null {
+  const result = sql`SELECT pubkey, name FROM ccns WHERE is_active = 1 LIMIT 1`(
+    db,
+  );
+  return result.length > 0
+    ? (result[0] as { pubkey: string; name: string })
+    : null;
+}
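A hedged sketch of how the new multi-CCN API chains together (database path and CCN name invented; assumes generateSecretKey from @nostr/tools):

import * as nostrTools from '@nostr/tools';
import { Database } from 'jsr:@db/sqlite';

const db = new Database('eve.db'); // illustrative path
const creatorPrivkey = nostrTools.generateSecretKey();
const creator = nostrTools.getPublicKey(creatorPrivkey);
const { pubkey } = await createNewCCN(db, 'home ccn', creator);
const ccnPrivkey = await getCCNPrivateKeyByPubkey(pubkey); // round-trips via the encrypted key file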
@@ -1,31 +0,0 @@
import { exists } from "jsr:@std/fs";

/**
 * Return the path to Eve's configuration directory.
 *
 * The configuration directory is resolved in the following order:
 * 1. The value of the `XDG_CONFIG_HOME` environment variable.
 * 2. The value of the `HOME` environment variable, with `.config` appended.
 *
 * If the resolved path does not exist, create it.
 */
export async function getEveConfigHome(): Promise<string> {
  const xdgConfigHome = Deno.env.get("XDG_CONFIG_HOME") ??
    `${Deno.env.get("HOME")}/.config`;
  const storagePath = `${xdgConfigHome}/arx/Eve`;
  if (!(await exists(storagePath))) {
    await Deno.mkdir(storagePath, { recursive: true });
  }
  return storagePath;
}

/**
 * Return the path to the file in Eve's configuration directory.
 *
 * @param file The name of the file to return the path for.
 * @returns The path to the file in Eve's configuration directory.
 */
export async function getEveFilePath(file: string): Promise<string> {
  const storagePath = await getEveConfigHome();
  return `${storagePath}/${file}`;
}
@@ -1,75 +0,0 @@
import * as colors from "jsr:@std/fmt@^1.0.4/colors";
import * as log from "jsr:@std/log";
import { getEveFilePath } from "./files.ts";
export * as log from "jsr:@std/log";

export async function setupLogger() {
  const formatLevel = (level: number): string => {
    return (
      {
        10: colors.gray("[DEBUG]"),
        20: colors.green("[INFO] "),
        30: colors.yellow("[WARN] "),
        40: colors.red("[ERROR]"),
        50: colors.bgRed("[FATAL]"),
      }[level] || `[LVL${level}]`
    );
  };

  const levelName = (level: number): string => {
    return {
      10: "DEBUG",
      20: "INFO",
      30: "WARN",
      40: "ERROR",
      50: "FATAL",
    }[level] || `LVL${level}`;
  };

  const formatArg = (arg: unknown): string => {
    if (typeof arg === "object") return JSON.stringify(arg);
    return String(arg);
  };

  await log.setup({
    handlers: {
      console: new log.ConsoleHandler("DEBUG", {
        useColors: true,
        formatter: (record) => {
          const timestamp = new Date().toISOString();
          let msg = `${colors.dim(`[${timestamp}]`)} ${
            formatLevel(record.level)
          } ${record.msg}`;

          if (record.args.length > 0) {
            const args = record.args
              .map((arg, i) => `${colors.dim(`arg${i}:`)} ${formatArg(arg)}`)
              .join(" ");
            msg += ` ${colors.dim("|")} ${args}`;
          }

          return msg;
        },
      }),
      file: new log.FileHandler("DEBUG", {
        filename: Deno.env.get("LOG_FILE") ||
          await getEveFilePath("eve-logs.jsonl"),
        formatter: (record) => {
          const timestamp = new Date().toISOString();
          return JSON.stringify({
            timestamp,
            level: levelName(record.level),
            msg: record.msg,
            args: record.args,
          });
        },
      }),
    },
    loggers: {
      default: {
        level: "DEBUG",
        handlers: ["console", "file"],
      },
    },
  });
}