Merge pull request #139 from ashpect/dev

Prove name is not in Ofac List
This commit is contained in:
turboblitz
2024-08-13 13:05:31 -07:00
committed by GitHub
32 changed files with 101537 additions and 73 deletions

common/ofacdata/ReadMe.md Normal file

@@ -0,0 +1,30 @@
# How We Process OFAC Lists
## Data Collection
- We collect the data from the official website of the U.S. Department of the Treasury's Office of Foreign Assets Control (OFAC) and download it as CSV files from [here](https://sanctionslist.ofac.treas.gov/Home/SdnList).
- The SDN list contains the names of individuals, entities, and groups designated by OFAC, as well as maritime vessels and aircraft blocked by OFAC.
### ofacdata/original
- The data is stored as two CSV files named `sdn.csv` and `add.csv`. `dataspec.txt` explains the data specification for the CSV files.
- The data is cleaned to extract the required information for individuals from the `sdn.csv` file.
A ballpark of 6917 individual entries (at the time of writing) are present in `sdn.csv`. The remaining entries are entities, vessels, and aircraft.
## Data Processing
### ofacdata/scripts
- The `ofac.ipynb` script extracts the data from both CSVs and parses it into JSON format.
- We parse all Ethereum addresses, whether they belong to an individual or an entity, into `eth_addresses.json`.
- For individuals, we parse:
  - full name (first name, last name) and DOB (day, month, year) into `names.json`
  - passport numbers and passport-issuing countries into `passport.json`
- The resulting JSON files are stored in `ofacdata/inputs` for further use by the SMTs.
## Data Usage
These JSON files are later used to build sparse Merkle trees for non-membership proofs. We provide three levels of proofs:
- Match by passport number: level 3 (Absolute Match)
- Match by name-and-DOB combo tree: level 2 (High Probability Match)
- Match by name only: level 1 (Partial Match)
The Merkle trees are also exported as JSON in `ofacdata/outputs` so they can be imported directly instead of rebuilt, saving time.
Check out `src/ofacTree.ts` for more details.
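The three proof levels above can be illustrated with a minimal TypeScript stand-in. Note the assumptions: a plain `Set` stands in for the sparse Merkle tree, and `selectLeaf`/`isNonMember` are hypothetical helper names; the real leaves are Poseidon hashes built in `src/ofacTree.ts`.

```typescript
// Minimal illustration of the three OFAC proof levels: a plain Set
// stands in for the sparse Merkle tree of sanctioned leaves.
type OfacLevel = 1 | 2 | 3;

// Pick which leaf to prove non-membership for, per proof level.
function selectLeaf(
  level: OfacLevel,
  leaves: { passport: string; nameDob: string; name: string }
): string {
  switch (level) {
    case 3: return leaves.passport; // Absolute Match
    case 2: return leaves.nameDob;  // High Probability Match
    case 1: return leaves.name;     // Partial Match
  }
}

// Non-membership means the leaf is absent from the sanctioned set.
function isNonMember(tree: Set<string>, leaf: string): boolean {
  return !tree.has(leaf);
}
```

In the real flow, `isNonMember` is replaced by an SMT non-membership proof that the circuit can verify against the tree root.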


@@ -0,0 +1,149 @@
[
{
"Eth_address": "0x098B716B8Aaf21512996dC57EB0615e2383E2f96"
},
{
"Eth_address": "0xa0e1c89Ef1a489c9C7dE96311eD5Ce5D32c20E4B"
},
{
"Eth_address": "0x3Cffd56B47B7b41c56258D9C7731ABaDc360E073"
},
{
"Eth_address": "0x53b6936513e738f44FB50d2b9476730C0Ab3Bfc1"
},
{
"Eth_address": "0x35fB6f6DB4fb05e6A4cE86f2C93691425626d4b1"
},
{
"Eth_address": "0xF7B31119c2682c88d88D455dBb9d5932c65Cf1bE"
},
{
"Eth_address": "0x3e37627dEAA754090fBFbb8bd226c1CE66D255e9"
},
{
"Eth_address": "0x08723392Ed15743cc38513C4925f5e6be5c17243"
},
{
"Eth_address": "0x7F367cC41522cE07553e823bf3be79A889DEbe1B"
},
{
"Eth_address": "0xd882cfc20f52f2599d84b8e8d58c7fb62cfe344b"
},
{
"Eth_address": "0x901bb9583b24d97e995513c6778dc6888ab6870e"
},
{
"Eth_address": "0xa7e5d5a720f06526557c513402f2e6b5fa20b008"
},
{
"Eth_address": "0x9f4cda013e354b8fc285bf4b9a60460cee7f7ea9"
},
{
"Eth_address": "0x3cbded43efdaf0fc77b9c55f6fc9988fcc9b757d"
},
{
"Eth_address": "0x7FF9cFad3877F21d41Da833E2F775dB0569eE3D9"
},
{
"Eth_address": "0xc2a3829F459B3Edd87791c74cD45402BA0a20Be3"
},
{
"Eth_address": "0x3AD9dB589d201A710Ed237c829c7860Ba86510Fc"
},
{
"Eth_address": "0x12D66f87A04A9E220743712cE6d9bB1B5616B8Fc"
},
{
"Eth_address": "0x47CE0C6eD5B0Ce3d3A51fdb1C52DC66a7c3c2936"
},
{
"Eth_address": "0x910Cbd523D972eb0a6f4cAe4618aD62622b39DbF"
},
{
"Eth_address": "0xA160cdAB225685dA1d56aa342Ad8841c3b53f291"
},
{
"Eth_address": "0xD4B88Df4D29F5CedD6857912842cff3b20C8Cfa3"
},
{
"Eth_address": "0xFD8610d20aA15b7B2E3Be39B396a1bC3516c7144"
},
{
"Eth_address": "0x07687e702b410Fa43f4cB4Af7FA097918ffD2730"
},
{
"Eth_address": "0x23773E65ed146A459791799d01336DB287f25334"
},
{
"Eth_address": "0x22aaA7720ddd5388A3c0A3333430953C68f1849b"
},
{
"Eth_address": "0x03893a7c7463AE47D46bc7f091665f1893656003"
},
{
"Eth_address": "0x2717c5e28cf931547B621a5dddb772Ab6A35B701"
},
{
"Eth_address": "0xD21be7248e0197Ee08E0c20D4a96DEBdaC3D20Af"
},
{
"Eth_address": "0x39D908dac893CBCB53Cc86e0ECc369aA4DeF1A29"
},
{
"Eth_address": "0x4f47bc496083c727c5fbe3ce9cdf2b0f6496270c"
},
{
"Eth_address": "0x97b1043abd9e6fc31681635166d430a458d14f9c"
},
{
"Eth_address": "0xb6f5ec1a0a9cd1526536d3f0426c429529471f40"
},
{
"Eth_address": "0x9c2bc757b66f24d60f016b6237f8cdd414a879fa"
},
{
"Eth_address": "0xdcbEfFBECcE100cCE9E4b153C4e15cB885643193"
},
{
"Eth_address": "0x5f48c2a71b2cc96e3f0ccae4e39318ff0dc375b2"
},
{
"Eth_address": "0x5a7a51bfb49f190e5a6060a5bc6052ac14a3b59f"
},
{
"Eth_address": "0xed6e0a7e4ac94d976eebfb82ccf777a3c6bad921"
},
{
"Eth_address": "0x797d7ae72ebddcdea2a346c1834e04d1f8df102b"
},
{
"Eth_address": "0x931546D9e66836AbF687d2bc64B30407bAc8C568"
},
{
"Eth_address": "0x43fa21d92141BA9db43052492E0DeEE5aa5f0A93"
},
{
"Eth_address": "0x6be0ae71e6c41f2f9d0d1a3b8d0f75e6f6a0b46e"
},
{
"Eth_address": "0x530a64c0ce595026a4a556b703644228179e2d57"
},
{
"Eth_address": "0x983a81ca6FB1e441266D2FbcB7D8E530AC2E05A2"
},
{
"Eth_address": "0x961c5be54a2ffc17cf4cb021d863c42dacd47fc1"
},
{
"Eth_address": "0xE950DC316b836e4EeFb8308bf32Bf7C72a1358FF"
},
{
"Eth_address": "0x21B8d56BDA776bbE68655A16895afd96F5534feD"
},
{
"Eth_address": "0xf3701f445b6bdafedbca97d1e477357839e4120d"
},
{
"Eth_address": "0x19F8f2B0915Daa12a3f5C9CF01dF9E24D53794F7"
}
]

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -0,0 +1,308 @@
OFFICE OF FOREIGN ASSETS CONTROL
U.S. TREASURY DEPARTMENT
SPECIALLY DESIGNATED NATIONALS AND BLOCKED PERSONS
DATA SPECIFICATION
First Released: 12/06/2004
Updated: 09/25/2023
PLEASE NOTE: IMPORTANT INFORMATION REGARDING THE TECHNICAL STRUCTURE
OF THESE FILES IS FEATURED AT THE BOTTOM OF THE DOCUMENT
OFAC is now publishing its list of Specially Designated Nationals in XML
and Comma delimited (CSV) format. These digital publications as
transmitted by OFAC are designed as reference tools providing actual notice of
actions by OFAC with respect to Specially Designated Nationals and other
entities whose property is blocked, to assist the public in complying with the
various sanctions programs administered by OFAC. The latest changes may appear
here prior to their publication in the Federal Register, and it is intended that
users rely on changes indicated in these documents that post-date the most
recent Federal Register publication with respect to a particular sanctions
program in the appendices to chapter V of Title 31, Code of Federal Regulations.
Such changes reflect official actions of OFAC, and will be reflected as soon as
practicable in the Federal Register under the index heading "Foreign Assets
Control." New Federal Register notices with regard to Specially Designated
Nationals or blocked entities may be published at any time. Users are advised
to check the Federal Register and these electronic publications routinely for
additional names or other changes to the listings. Entities and individuals on
the list are occasionally licensed by OFAC to transact business with U.S.
persons in anticipation of removal from the list or because of foreign policy
considerations in unique circumstances. Licensing in anticipation of official
Federal Register publication of a notice of removal based on the unblocking of
an entity's or individual's property is reflected in these publications by
removal from the list. Current information on licenses issued with regard to
Specially Designated Nationals and other blocked persons may be obtained or
verified by calling OFAC Licensing at (202) 622-2480.
Technical Specification:
Format *.ff consists of records separated by carriage returns, with fields
within the records beginning at fixed locations.
Format *.csv consists of records separated by carriage returns (ASCII character
13), with fields (values) within records delimited by the "," (comma) symbol
(ASCII character 44).
Null values for all four formats consist of "-0-" (ASCII characters 45, 48, 45).
The Comma Separated Values (.csv), and Fixed-Field (.ff) releases consist of three ASCII text files--a main
file listing the name of the SDN and other information unique to that entity
(sdn.*), a file of addresses (add.*), and a file of alternate names (alt.*).
Addresses and alternate names are linked to particular SDNs using unique integer
values in a linking or primary key column. The integers used are assigned for
linking purposes only and do not represent an official reference to that entity.
Releases of the database-format files are intended as a service to the user
community. OFAC's SDN list is published in the Federal Register. All of OFAC's
lists are drawn from the same underlying data and every effort has been made to
ensure consistency. The Federal Register will govern should differences arise.
Due to the nature, urgency, and sensitivity of the programs which OFAC
administers and enforces, it may not always be possible to provide advanced
notice to users of format changes to the database structure.
The files associated with each release are:
fixed field: SDN.FF, ADD.FF, ALT.FF, SDN_COMMENTS.FF
Comma delimited: SDN.CSV, ADD.CSV, ALT.CSV, SDN COMMENTS.CSV
XML: SDN.XML, SDN_ADVANCED.XML
Misc: dat_spec.txt (this file), sdn.xsd (XML SDN schema),
sdn_advanced.xsd (advanced XML SDN schema).
FORMAT SDN FIXED FIELD
Main table, text file name SDN.FF
Column Posi-
sequence Column name Type Size tion Description
-------- ----------- ------- ---- ---- ---------------------
1 ent_num number 10 1 unique record
identifier/unique
listing identifier
2 SDN_Name text 350 11 name of sdn
3 SDN_Type text 12 361 type of SDN
4 Program text 200 373 sanctions program name
5 Title text 200 573 title of an individual
6 Call_Sign text 8 773 vessel call sign
7 Vess_type text 25 781 vessel type
8 Tonnage text 14 806 vessel tonnage
9 GRT text 8 820 gross registered
tonnage
10 Vess_flag text 40 828 vessel flag
11 Vess_owner text 150 868 vessel owner
12 Remarks text 1000 1018 remarks on SDN*
END OF ROW 2018
Address table, text file name ADD.FF
Column Posi-
sequence Column name Type Size tion Description
-------- ----------- ------- ---- ---- ---------------------
1 Ent_num number 10 1 link to unique listing
2 Add_num number 10 11 unique record
identifier
3 Address text 750 21 street address of SDN
4 City/ text 116 771 city, state/province, zip/postal code
State/Province/
Postal Code
5 Country text 250 887 country of address
6 Add_remarks text 200 1137 remarks on address
END OF ROW 1337
Alternate identity table, text file name ALT.FF
Column Posi-
sequence Column name Type Size tion Description
-------- ----------- ------- ---- ---- ---------------------
1 ent_num number 10 1 link to unique listing
2 alt_num number 10 11 unique record
identifier
3 alt_type text 8 21 type of alternate
identity
(aka, fka, nka)
4 alt_name text 350 29 alternate identity name
5 alt_remarks text 200 379 remarks on alternate
identity
END OF ROW 579
Record separator: carriage return
null: -0-
FORMAT SDN CSV
Main table, text file name SDN.CSV
Column
sequence Column name Type Size Description
-------- ------------ ------- ---- ---------------------
1 ent_num number unique record
identifier/unique
listing identifier
2 SDN_Name text 350 name of SDN
3 SDN_Type text 12 type of SDN
4 Program text 200 sanctions program name
5 Title text 200 title of an individual
6 Call_Sign text 8 vessel call sign
7 Vess_type text 25 vessel type
8 Tonnage text 14 vessel tonnage
9 GRT text 8 gross registered tonnage
10 Vess_flag text 40 vessel flag
11 Vess_owner text 150 vessel owner
12 Remarks text 1000 remarks on SDN*
Address table, text file name ADD.CSV
Column
sequence Column name Type Size Description
-------- ------------ ------- ---- ---------------------
1 Ent_num number link to unique listing
2 Add_num number unique record identifier
3 Address text 750 street address of SDN
4 City/ text 116 city, state/province, zip/postal code
State/Province/
Postal Code
5 Country text 250 country of address
6 Add_remarks text 200 remarks on address
Alternate identity table, text file name ALT.CSV
Column
sequence Column name Type Size Description
-------- ------------ ------- ---- ---------------------
1 ent_num number link to unique listing
2 alt_num number unique record identifier
3 alt_type text 8 type of alternate identity
(aka, fka, nka)
4 alt_name text 350 alternate identity name
5 alt_remarks text 200 remarks on alternate identity
Record separator: carriage return
field (value) delimiter: ,
text value quotes: "
null: -0-
*SPILLOVER FILES:
OFAC has made certain changes to its SDN production system that now allow for
an unlimited number of identifiers, features and linked to identifications to
be added to a record. In the fixed-width and delimited files these data are
stored in the remarks field. Due to these changes, it is now possible for an
SDN record to exceed the 1000 character remarks limitation. Data that exceeds
the specified field limit will be truncated to ensure that the current data
specification is followed. However, in order to ensure that users of these
files continue to have access to truncated data, OFAC has created "spillover files."
These files will follow the same data specification of the files they are
associated with. However, there will be no upper limit on row length in these files.
The spillover file names are:
sdn_comments.csv
sdn_comments.ff
These files will be listed separately on the OFAC website's SDN page. They will also be listed
separately in the library/fac_dlim and /fac_delim folders of OFAC's FTP sites.
Please visit the following tutorial on OFAC's website for more information on
creating a database using these files:
https://ofac.treasury.gov/sdn-list-data-formats-data-schemas/tutorial-on-the-use-of-list-related-legacy-flat-files
THE DISPOSITION OF ALIASES:
OFAC classifies SDN aliases as weak or strong. In the data files
discussed in this document, weak aliases are not stored in the alt.* files.
Weak aliases are stored in the remarks field that trails every primary
SDN record in the SDN.* files. For more information on weak aliases
please review the following text taken from the frequently asked questions
on OFAC's website.
What are weak aliases (AKAs)?
A "weak AKA" is a term for a relatively broad or generic alias that
may generate a large volume of false hits. Weak AKAs include
nicknames, noms-de-guerre, and unusually common acronyms. OFAC
includes these AKAs because, based on information available to it, the
sanctions targets refer to themselves, or are referred to, by these
names. As a result, these AKAs may be useful for identification
purposes, particularly in confirming a possible "hit" or "match"
triggered by other identifier information. Realizing, however, the
large number of false hits that these names may generate, OFAC
qualitatively distinguishes them from other AKAs by designating them
as weak. OFAC has instituted procedures that attempt to make this
qualitative review of aliases as objective as possible. Before
issuing this updated guidance, OFAC conducted a review of all aliases
on the SDN list. Each SDN alias was run through a computer program
that evaluated the potential of an alias to produce false positives in
an automated screening environment. Names were evaluated using the
following criteria:
Character length (shorter strings were assumed to be less effective in
a screening environment than longer strings);
The presence of numbers in an alias (digits 0-9);
The presence of common words that are generally considered to
constitute a nickname (example: Ahmed the Tall);
References in the alias to geographic locations (example: Ahmed the
Sudanese);
The presence of very common prefixes in a name where the prefix was
one of only two strings in a name (example: Mr. Smith).
Aliases that met one or more of the above criteria were flagged for
human review. OFAC subject matter experts then reviewed each of the
automated recommendations and made final decisions on the flagging of
each alias.*
OFAC intends to use these procedures to evaluate all new aliases
introduced to the SDN list.
Where can I find weak aliases (AKAs)?
Weak AKAs appear differently depending on which file format of the SDN
List is utilized.
In the TXT and PDF versions of the SDN List, weak AKAs are
encapsulated in double-quotes within the AKA listing:
ALLANE, Hacene (a.k.a. ABDELHAY, al-Sheikh; a.k.a. AHCENE, Cheib;
a.k.a. "ABU AL-FOUTOUH"; a.k.a. "BOULAHIA"; a.k.a. "HASSAN THE OLD");
DOB 17 Jan 1941; POB El Menea, Algeria (individual) [SDGT]
This convention also is followed in the alphabetical listing published
in Appendix A to Chapter V of Title 31 of the Code of Federal
Regulations.
In the FF, and CSV file formats, weak AKAs are listed in the
Remarks field (found at the end of the record) of the SDN file. In
these formats, weak AKAs are bracketed by quotation marks.
8219@"ALLANE, Hacene"@"individual"@"SDGT"@-0- @-0- @-0- @-0- @-0- @-0-
@-0- @"DOB 17 Jan 1941; POB El Menea, Algeria; a.k.a. 'ABU
AL-FOUTOUH'; a.k.a. 'BOULAHIA'; a.k.a. 'HASSAN THE OLD'."
In the XML version of the SDN List, there is a Type element for each
AKA. The Type can either be 'weak' or 'strong' (see the XML SDN
Schema (XSD file) at:
http://www.treasury.gov/resource-center/sanctions/SDN-List/Documents/sdn.xsd for more
information).
Am I required to screen for weak aliases (AKAs)?
OFAC's regulations do not explicitly require any specific screening
regime. Financial institutions and others must make screening choices
based on their circumstances and compliance approach. As a general
matter, though, OFAC does not expect that persons will screen for weak
AKAs, but expects that such AKAs may be used to help determine whether
a "hit" arising from other information is accurate.
Will I be penalized for processing an unauthorized transaction
involving a weak alias (AKA)?
A person who processes an unauthorized transaction involving an SDN
has violated U.S. law and may be subject to an enforcement action.
Generally speaking, however, if (i) the only sanctions reference in
the transaction is a weak AKA, (ii) the person involved in the
processing had no other reason to know that the transaction involved
an SDN or was otherwise in violation of U.S. law, and (iii) the person
maintains a rigorous risk-based compliance program, OFAC will not
issue a civil penalty against an individual or entity for processing
such a transaction.
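The CSV layout described above (comma-delimited records, double-quoted text values, `-0-` as the null sentinel) can be sketched in TypeScript. This is an assumption-laden sketch: `parseSdnFields` is a hypothetical helper name, and it expects the record to be pre-split quote-aware, since a naive comma split would break names like "ALLANE, Hacene".

```typescript
// Sketch of normalizing one SDN.CSV record per the spec above:
// strip surrounding double quotes and map the "-0-" sentinel
// (ASCII 45, 48, 45) to null. Assumes the row is already split
// into fields by a quote-aware CSV parser.
function parseSdnFields(fields: string[]): (string | null)[] {
  return fields.map((raw) => {
    const value = raw.replace(/^"|"$/g, ""); // remove enclosing quotes
    return value === "-0-" ? null : value;
  });
}
```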

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -1,5 +1,6 @@
{
"dependencies": {
"@ashpect/smt": "https://github.com/ashpect/smt#main",
"@babel/runtime": "^7.23.4",
"@zk-kit/imt": "https://gitpkg.now.sh/0xturboblitz/zk-kit/packages/imt?6d417675",
"@zk-kit/lean-imt": "^2.0.1",


@@ -342,11 +342,642 @@ export const mockPassportData_sha512_ecdsa = {
photoBase64: 'iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAABjElEQVR42mL8//8/AyUYiBQYmIw3...',
};
export const mockPassportData2_sha256_rsa_65537 = {
"mrz": "P<FRATESTIN<<ABCDEFGH<ABCDEF<ABCDEF<<<<<<<<<42GH123454FRA0001011M4022225<<<<<<<<<<<<<<22",
"signatureAlgorithm": "sha256WithRSASSAPSS",
"pubKey": {
"modulus": "24462187253413274681146293990014601117483150253485750502784042435672184694412963307122026240846907391312882376801424642119473345751861224453041335405750030091821974208795494089279845074559882616814677854700627123408815125641207116387150180075958682953326415376187334908428885389819481874887447587379894859906385655519567588675165038354987379327125622417796020195813417774532495071662150990707566936780047952622227454986438290561518433120591444215515025611804148686981931860883842745085825036432179109865379231910853752492302597965268640127145219080386320748404798990484447108886155687702517815916586904444799519356177",
"exponent": "65537"
},
"dataGroupHashes": [
-108,
31,
-26,
62,
-36,
-36,
113,
-93,
88,
-94,
90,
-108,
70,
-11,
-101,
1,
76,
-53,
60,
-106,
63,
-124,
-107,
-28,
111,
97,
47,
66,
120,
-9,
123,
-5,
-13,
-49,
82,
60,
28,
116,
89,
-6,
-96,
-67,
123,
28,
36,
-4,
-68,
75,
-15,
-20,
119,
91,
50,
-33,
-2,
-2,
-38,
-16,
44,
-13,
112,
102,
-66,
82,
-76,
-21,
-34,
33,
79,
50,
-104,
-120,
-114,
35,
116,
-32,
6,
-14,
-100,
-115,
-128,
-8,
10,
61,
98,
86,
-8,
45,
-49,
-46,
90,
-24,
-81,
38,
0,
-62,
104,
108,
-19,
-10,
97,
-26,
116,
-58,
69,
110,
26,
87,
17,
89,
110,
-57,
108,
-6,
36,
21,
39,
87,
110,
102,
-6,
-43,
-82,
-125,
-85,
-82,
-120,
-101,
87,
-112,
111,
15,
-104,
127,
85,
25,
-102,
81,
20,
58,
51,
75,
-63,
116,
-22,
0,
60,
30,
29,
30,
-73,
-115,
72,
-9,
-1,
-53,
100,
124,
41,
-22,
106,
78,
31,
11,
114,
-119,
-19,
17,
92,
71,
-122,
47,
62,
78,
-67,
-23,
-55,
-42,
53,
4,
47,
-67,
-55,
-123,
6,
121,
34,
-125,
64,
-114,
91,
-34,
-46,
-63,
62,
-34,
104,
82,
36,
41,
-118,
-3,
70,
15,
-108,
-48,
-100,
45,
105,
-85,
-15,
-61,
-71,
43,
-39,
-94,
-110,
-55,
-34,
89,
-18,
38,
76,
123,
-40,
13,
51,
-29,
72,
-11,
59,
-63,
-18,
-90,
103,
49,
23,
-92,
-85,
-68,
-62,
-59,
-100,
-69,
-7,
28,
-58,
95,
69,
15,
-74,
56,
54,
38
],
"eContent": [
49,
102,
48,
21,
6,
9,
42,
-122,
72,
-122,
-9,
13,
1,
9,
3,
49,
8,
6,
6,
103,
-127,
8,
1,
1,
1,
48,
28,
6,
9,
42,
-122,
72,
-122,
-9,
13,
1,
9,
5,
49,
15,
23,
13,
49,
57,
49,
50,
49,
54,
49,
55,
50,
50,
51,
56,
90,
48,
47,
6,
9,
42,
-122,
72,
-122,
-9,
13,
1,
9,
4,
49,
34,
4,
32,
-5,
100,
-110,
-122,
97,
101,
57,
83,
-95,
14,
7,
14,
-63,
83,
-57,
-104,
-21,
114,
-31,
45,
-31,
74,
-60,
58,
-37,
-106,
-113,
-80,
-49,
-112,
83,
77
],
"encryptedDigest": [
-97,
-108,
-50,
54,
29,
77,
47,
-128,
26,
-86,
6,
43,
103,
77,
54,
-105,
-112,
116,
63,
75,
-127,
9,
68,
112,
-55,
-91,
-9,
-17,
24,
55,
-31,
-31,
76,
-82,
79,
117,
-15,
46,
59,
-111,
-33,
-93,
-46,
-82,
116,
-35,
70,
-4,
-41,
-39,
-34,
-94,
99,
76,
22,
-62,
96,
106,
-118,
41,
-2,
-7,
-103,
-125,
-74,
-66,
111,
-5,
-120,
-76,
-108,
-106,
-59,
25,
-124,
-109,
57,
-108,
76,
0,
80,
-106,
-23,
116,
64,
35,
-79,
-93,
-3,
99,
-61,
-15,
-41,
-104,
-17,
-116,
-67,
-39,
42,
-34,
100,
61,
-66,
28,
46,
63,
118,
46,
59,
70,
124,
76,
74,
-38,
-43,
-73,
62,
-39,
-99,
58,
-53,
-56,
81,
-26,
43,
-6,
-93,
52,
-37,
-66,
-40,
-95,
70,
-118,
-67,
-55,
-8,
56,
48,
52,
75,
-5,
16,
64,
-85,
29,
37,
32,
40,
26,
-13,
7,
-61,
-49,
-48,
93,
-29,
-18,
-12,
-31,
89,
39,
-45,
-103,
-26,
-12,
5,
-29,
-88,
102,
-107,
-122,
15,
-43,
-57,
-44,
83,
100,
7,
116,
-125,
-78,
-6,
32,
46,
115,
106,
-81,
90,
-40,
99,
55,
-128,
41,
-20,
93,
-58,
71,
50,
-90,
-52,
-16,
112,
60,
97,
-84,
7,
-22,
-23,
-105,
78,
105,
3,
34,
-105,
-33,
112,
122,
-43,
24,
-34,
87,
24,
-95,
-51,
-81,
-29,
-31,
18,
-36,
-38,
-32,
-64,
-115,
13,
88,
-85,
6,
-77,
-122,
-36,
94,
-106,
-122,
97,
-43,
-110,
36,
41,
54,
91,
18,
16,
105,
-124,
-86
],
"photoBase64": "iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAABjElEQVR42mL8//8/AyUYiBQYmIw3..."
}
export const mockPassportDatas = [
mockPassportData_sha256_rsa_65537,
mockPassportData_sha256_sha1MRZ_rsa_65537,
mockPassportData_sha1_rsa_65537,
mockPassportData_sha256_rsapss_65537,
mockPassportData2_sha256_rsa_65537,
mockPassportData_sha1_ecdsa,
mockPassportData_sha256_ecdsa,
mockPassportData_sha512_ecdsa,


@@ -11,15 +11,20 @@ import {
getHashLen,
getCurrentDateYYMMDD,
generateMerkleProof,
generateSMTProof,
findSubarrayIndex,
hexToDecimal,
} from './utils';
import { LeanIMT } from '@zk-kit/lean-imt';
import { getLeaf } from './pubkeyTree';
import { poseidon6 } from 'poseidon-lite';
import { packBytes } from '../utils/utils';
import { getCSCAModulusMerkleTree } from './csca';
import { mockPassportDatas } from '../constants/mockPassportData';
import { LeanIMT } from "@zk-kit/lean-imt";
import { getLeaf } from "./pubkeyTree";
import { getNameLeaf, getNameDobLeaf, getPassportNumberLeaf } from "./ofacTree";
import { poseidon6 } from "poseidon-lite";
import { packBytes } from "../utils/utils";
import { getCSCAModulusMerkleTree } from "./csca";
import {
mockPassportDatas,
} from "../constants/mockPassportData";
import { SMT } from "@ashpect/smt"
export function generateCircuitInputsRegister(
secret: string,
@@ -194,6 +199,46 @@ export function generateCircuitInputsDisclose(
};
}
export function generateCircuitInputsOfac(
secret: string,
attestation_id: string,
passportData: PassportData,
merkletree: LeanIMT,
majority: string,
bitmap: string[],
scope: string,
user_identifier: string,
sparsemerkletree: SMT,
proofLevel : number,
) {
const result = generateCircuitInputsDisclose(secret, attestation_id, passportData, merkletree, majority, bitmap, scope, user_identifier);
const { majority: _, scope: __, bitmap: ___, user_identifier: ____, ...finalResult } = result;
const mrz_bytes = formatMrz(passportData.mrz);
const passport_leaf = getPassportNumberLeaf(mrz_bytes.slice(49,58))
const namedob_leaf = getNameDobLeaf(mrz_bytes.slice(10,49), mrz_bytes.slice(62, 68)) // [57-62] + 5 shift
const name_leaf = getNameLeaf(mrz_bytes.slice(10,49)) // [6-44] + 5 shift
let root,closestleaf,siblings;
if(proofLevel == 3){
({root, closestleaf, siblings} = generateSMTProof(sparsemerkletree, passport_leaf));
} else if(proofLevel == 2){
({root, closestleaf, siblings} = generateSMTProof(sparsemerkletree, namedob_leaf));
} else if (proofLevel == 1){
({root, closestleaf, siblings} = generateSMTProof(sparsemerkletree, name_leaf));
} else {
throw new Error("Invalid proof level")
}
return {
...finalResult,
closest_leaf: [BigInt(closestleaf).toString()],
smt_root: [BigInt(root).toString()],
smt_siblings: siblings.map(index => BigInt(index).toString()),
};
}
// this gets the commitment index whether it is a string or a bigint
// this is necessary rn because when the tree is sent from the server in a serialized form,
// the bigints are converted to strings and I can't figure out how to use tree.import to load bigints there


@@ -0,0 +1,172 @@
import { poseidon9, poseidon3, poseidon2, poseidon6, poseidon13 } from "poseidon-lite"
import { stringToAsciiBigIntArray } from "./utils";
import { ChildNodes, SMT } from "@ashpect/smt"
// SMT trees for 3 levels :
// 1. Passport tree : level 3 (Absolute Match)
// 2. Names and dob combo tree : level 2 (High Probability Match)
// 3. Names tree : level 1 (Partial Match)
export function buildSMT(field: any[], treetype: string): [number, number, SMT] {
let count = 0
let startTime = performance.now();
const hash2 = (childNodes: ChildNodes) => (childNodes.length === 2 ? poseidon2(childNodes) : poseidon3(childNodes))
const tree = new SMT(hash2, true)
for (let i = 0; i < field.length; i++) {
const entry = field[i]
if (i !== 0) {
console.log('Processing', treetype,'number', i, "out of", field.length);
}
let leaf = BigInt(0)
if (treetype == "passport") {
leaf = processPassport(entry.Pass_No, i)
} else if (treetype == "name_dob") {
leaf = processNameDob(entry, i)
} else if (treetype == "name"){
leaf = processName(entry.First_Name, entry.Last_Name, i)
}
if( leaf==BigInt(0) || tree.createProof(leaf).membership){
console.log("This entry already exists in the tree, skipping...")
continue
}
count += 1
tree.add(leaf,BigInt(1))
}
console.log("Total", treetype, "parsed:", count, "out of", field.length)
console.log(treetype, 'tree built in', performance.now() - startTime, 'ms')
return [count, performance.now() - startTime, tree]
}
function processPassport(passno : string, index: number): bigint {
if (passno.length > 9) {
console.log('passport length is greater than 9:', index, passno)
} else if (passno.length < 9){
while (passno.length != 9) {
passno += '<'
}
}
const leaf = getPassportNumberLeaf(stringToAsciiBigIntArray(passno))
if (!leaf) {
console.log('Error creating leaf value', index, passno)
return BigInt(0)
}
return leaf
}
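The padding step in `processPassport` can be isolated as a standalone sketch. `padPassportNumber` is a hypothetical helper name; one deliberate variation is that over-long numbers are rejected here, whereas the script above only logs them.

```typescript
// Sketch of the MRZ document-number normalization used above: the
// field is exactly 9 characters, so short numbers are right-padded
// with the '<' filler. Numbers longer than 9 are rejected here
// (the original script just logs them).
function padPassportNumber(passno: string): string {
  if (passno.length > 9) {
    throw new Error(`passport number longer than 9 chars: ${passno}`);
  }
  return passno.padEnd(9, "<");
}
```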
function processNameDob(entry: any, i: number): bigint {
const firstName = entry.First_Name
const lastName = entry.Last_Name
const day = entry.day
const month = entry.month
const year = entry.year
if(day == null || month == null || year == null){
console.log('dob is null', i, entry)
return BigInt(0)
}
const nameHash = processName(firstName,lastName,i)
const dobHash = processDob(day, month, year,i)
const leaf = poseidon2([dobHash, nameHash])
return leaf
}
function processName(firstName:string, lastName:string, i: number ): bigint {
// LASTNAME<<FIRSTNAME<MIDDLENAME<<<... (6-44)
firstName = firstName.replace(/'/g, '');
firstName = firstName.replace(/\./g, '');
firstName = firstName.replace(/[- ]/g, '<');
lastName = lastName.replace(/'/g, '');
lastName = lastName.replace(/[- ]/g, '<');
lastName = lastName.replace(/\./g, '');
// Removed apostrophes from the first name, eg O'Neil -> ONeil
// Replace spaces and hyphens with '<' in the first name, eg John Doe -> John<Doe
// TODO : Handle special cases like malaysia : no two filler characters like << for surname and givenname
// TODO : Verify rules for . in names. eg : J. Doe (Done same as apostrophe for now)
let arr = lastName + '<<' + firstName
if (arr.length > 39) {
arr = arr.substring(0, 39)
} else {
while (arr.length < 39) {
arr += '<'
}
}
let nameArr = stringToAsciiBigIntArray(arr)
return getNameLeaf(nameArr, i)
}
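The name normalization in `processName` can be sketched as a pure string transform. `toMrzNameField` is a hypothetical helper name; it reproduces the rules above (drop apostrophes and periods, replace spaces and hyphens with `<`, join as `LASTNAME<<FIRSTNAME`, then truncate or pad to the 39-character field).

```typescript
// Sketch of the MRZ name-field normalization used above.
function toMrzNameField(firstName: string, lastName: string): string {
  // Drop apostrophes/periods, turn spaces and hyphens into '<'.
  const clean = (s: string) => s.replace(/['.]/g, "").replace(/[- ]/g, "<");
  const joined = clean(lastName) + "<<" + clean(firstName);
  // The MRZ name field is exactly 39 characters: truncate or pad with '<'.
  return joined.length > 39 ? joined.slice(0, 39) : joined.padEnd(39, "<");
}
```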
function processDob(day: string, month: string, year: string, i : number): bigint {
// YYMMDD
const monthMap: { [key: string]: string } = {
jan: "01",
feb: "02",
mar: "03",
apr: "04",
may: "05",
jun: "06",
jul: "07",
aug: "08",
sep: "09",
oct: "10",
nov: "11",
dec: "12"
};
month = monthMap[month.toLowerCase()];
year = year.slice(-2);
const dob = year + month + day;
let arr = stringToAsciiBigIntArray(dob);
return getDobLeaf(arr,i)
}
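The DOB conversion in `processDob` reduces to a small pure function. `toYYMMDD` is a hypothetical helper name; it maps a month name to its two-digit number, keeps the last two digits of the year, and concatenates to the MRZ's YYMMDD layout.

```typescript
// Sketch of the YYMMDD conversion used above.
const MONTHS: Record<string, string> = {
  jan: "01", feb: "02", mar: "03", apr: "04", may: "05", jun: "06",
  jul: "07", aug: "08", sep: "09", oct: "10", nov: "11", dec: "12",
};

function toYYMMDD(day: string, month: string, year: string): string {
  // e.g. ("17", "Jan", "1941") -> "41" + "01" + "17"
  return year.slice(-2) + MONTHS[month.toLowerCase()] + day;
}
```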
export function getPassportNumberLeaf(passport: (bigint|number)[], i?: number): bigint {
if (passport.length !== 9) {
console.log('parsed passport length is not 9:', i, passport)
return
}
try {
return poseidon9(passport)
} catch (err) {
console.log('err : passport', err, i, passport)
}
}
export function getNameDobLeaf(nameMrz : (bigint|number)[], dobMrz : (bigint|number)[], i? : number): bigint {
return poseidon2([getDobLeaf(dobMrz), getNameLeaf(nameMrz)])
}
export function getNameLeaf(nameMrz : (bigint|number)[] , i? : number ) : bigint {
let middleChunks: bigint[] = [];
let chunks: (number|bigint)[][] = [];
chunks.push(nameMrz.slice(0, 13), nameMrz.slice(13, 26), nameMrz.slice(26, 39)); // 39/3 chunks of 13 for poseidon13 to digest
for(const chunk of chunks){
middleChunks.push(poseidon13(chunk))
}
try {
return poseidon3(middleChunks)
} catch (err) {
console.log('err : Name', err, i, nameMrz)
}
}
export function getDobLeaf(dobMrz : (bigint|number)[], i? : number): bigint {
if (dobMrz.length !== 6) {
console.log('parsed dob length is not 6:', i, dobMrz)
return
}
try {
return poseidon6(dobMrz)
} catch (err) {
console.log('err : Dob', err, i, dobMrz)
}
}


@@ -2,6 +2,7 @@ import { LeanIMT } from '@zk-kit/lean-imt';
import { sha256 } from 'js-sha256';
import { sha1 } from 'js-sha1';
import { sha384 } from 'js-sha512';
import { SMT } from '@ashpect/smt';
import forge from 'node-forge';
export function formatMrz(mrz: string) {
@@ -309,6 +310,56 @@ export function packBytes(unpacked) {
return packed;
}
export function generateSMTProof(smt: SMT, leaf: bigint) {
const {entry, matchingEntry, siblings, root, membership} = smt.createProof(leaf);
const depth = siblings.length
let closestleaf;
if (!matchingEntry){ // we got the 0 leaf or membership
// then check if entry[1] exists
if(!entry[1]){
// non membership proof
closestleaf = BigInt(0); // 0 leaf
} else {
closestleaf = BigInt(entry[0]); // leaf itself (memb proof)
}
} else {
// non membership proof
closestleaf = BigInt(matchingEntry[0]); // actual closest
}
// PATH, SIBLINGS manipulation as per binary tree in the circuit
siblings.reverse()
while(siblings.length < 256) siblings.push(BigInt(0));
// ----- Useful for debugging hence leaving as comments -----
// const binary = entry[0].toString(2)
// const bits = binary.slice(-depth);
// let indices = bits.padEnd(256, "0").split("").map(Number)
// const pathToMatch = num2Bits(256,BigInt(entry[0]))
// while(indices.length < 256) indices.push(0);
// // CALCULATED ROOT FOR TESTING
// // closestleaf, depth, siblings, indices, root : needed
// let calculatedNode = poseidon3([closestleaf,1,1]);
// console.log("Initial node while calculating",calculatedNode)
// console.log(smt.verifyProof(smt.createProof(leaf)))
// for (let i= 0; i < depth ; i++) {
// const childNodes: any = indices[i] ? [siblings[i], calculatedNode] : [calculatedNode, siblings[i]]
// console.log(indices[i],childNodes)
// calculatedNode = poseidon2(childNodes)
// }
// console.log("Actual node", root)
// console.log("calculated node", calculatedNode)
// -----------------------------------------------------------
return {
root,
depth,
closestleaf,
siblings,
};
}
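The branch logic that picks `closestleaf` above can be isolated as a small pure function. A sketch under stated assumptions: `entry` and `matchingEntry` mirror the tuples returned by `@ashpect/smt`'s `createProof` (`[key, value?]`), and `selectClosestLeaf` is a hypothetical name introduced here for illustration:

```typescript
// Sketch of the closest-leaf selection in generateSMTProof.
// entry = [key, value?] for the node reached on the lookup path;
// matchingEntry is set when a different leaf occupies that path.
function selectClosestLeaf(
  entry: (bigint | undefined)[],
  matchingEntry?: (bigint | undefined)[]
): bigint {
  if (!matchingEntry) {
    if (!entry[1]) {
      return BigInt(0); // empty slot: non-membership against the zero leaf
    }
    return BigInt(entry[0]!); // the leaf itself: membership proof
  }
  return BigInt(matchingEntry[0]!); // neighboring leaf: non-membership proof
}
```

The three return sites correspond to the three proof situations the circuit must handle: empty path, member, and closest non-member.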
export function generateMerkleProof(imt: LeanIMT, _index: number, maxDepth: number) {
const { siblings: merkleProofSiblings, index } = imt.generateProof(_index);
const depthForThisOne = merkleProofSiblings.length;
@@ -364,7 +415,6 @@ export function BigintToArray(n: number, k: number, x: bigint) {
return ret;
}
/**
* Converts a string of maximum 30 characters to a single BigInt.
* Each byte is represented by three digits in the resulting BigInt.
@@ -389,4 +439,41 @@ export function numberToString(num: bigint): string {
const str = num.toString().slice(1); // Remove leading '1'
const charCodes = str.match(/.{1,3}/g) || [];
return String.fromCharCode(...charCodes.map(code => parseInt(code, 10)));
}
export function stringToAsciiBigIntArray(str: string): bigint[] {
let asciiBigIntArray = [];
for (let i = 0; i < str.length; i++) {
asciiBigIntArray.push(BigInt(str.charCodeAt(i)));
}
return asciiBigIntArray;
}
export function hexToBin(n: string): string {
  // note: the first nibble is not zero-padded, so leading zero bits are dropped
  let bin = Number(`0x${n[0]}`).toString(2)
  for (let i = 1; i < n.length; i += 1) {
    bin += Number(`0x${n[i]}`).toString(2).padStart(4, "0")
  }
  return bin
}
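Because the first nibble is converted without padding, the output length depends on the leading hex digit. A quick self-contained check (copying `hexToBin` from the diff):

```typescript
// Copy of hexToBin from the diff: first nibble unpadded, rest padded to 4 bits.
function hexToBin(n: string): string {
  let bin = Number(`0x${n[0]}`).toString(2);
  for (let i = 1; i < n.length; i += 1) {
    bin += Number(`0x${n[i]}`).toString(2).padStart(4, "0");
  }
  return bin;
}
```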
// TypeScript port of circom's Num2Bits: little-endian bit decomposition with a reconstruction check.
export function num2Bits(n: number, inValue: bigint): bigint[] {
const out: bigint[] = new Array(n).fill(BigInt(0));
let lc1: bigint = BigInt(0);
let e2: bigint = BigInt(1);
for (let i = 0; i < n; i++) {
out[i] = (inValue >> BigInt(i)) & BigInt(1);
if (out[i] !== BigInt(0) && out[i] !== BigInt(1)) {
throw new Error("Bit value is not binary.");
}
lc1 += out[i] * e2;
e2 = e2 << BigInt(1);
}
if (lc1 !== inValue) {
throw new Error("Reconstructed value does not match the input.");
}
return out;
}
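`num2Bits` produces the little-endian bit array the circuit-side path indices are built from, and throws if `inValue` does not fit in `n` bits. A usage sketch (a simplified copy of the function, for self-containment; the always-false binary check is omitted since the mask already guarantees 0/1):

```typescript
// Simplified copy of num2Bits: little-endian decomposition plus a
// reconstruction check that rejects values wider than n bits.
function num2Bits(n: number, inValue: bigint): bigint[] {
  const out: bigint[] = new Array(n).fill(BigInt(0));
  let lc1 = BigInt(0);
  let e2 = BigInt(1);
  for (let i = 0; i < n; i++) {
    out[i] = (inValue >> BigInt(i)) & BigInt(1); // bit i of inValue
    lc1 += out[i] * e2;
    e2 = e2 << BigInt(1);
  }
  if (lc1 !== inValue) {
    throw new Error("Reconstructed value does not match the input."); // inValue needs more than n bits
  }
  return out;
}

const bits = num2Bits(8, BigInt(5)); // 5 = 0b101, little-endian: [1,0,1,0,...]
```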


@@ -2,6 +2,10 @@
# yarn lockfile v1
"@ashpect/smt@https://github.com/ashpect/smt#main":
version "1.0.0"
resolved "https://github.com/ashpect/smt#4f73fd24adb06a7f8efd6fd2d3ed58e9e2f2691a"
"@babel/runtime@^7.23.4":
version "7.23.4"
resolved "https://registry.yarnpkg.com/@babel/runtime/-/runtime-7.23.4.tgz#36fa1d2b36db873d25ec631dcc4923fdc1cf2e2e"
@@ -98,6 +102,11 @@ asn1.js@^5.4.1:
minimalistic-assert "^1.0.0"
safer-buffer "^2.1.0"
asynckit@^0.4.0:
version "0.4.0"
resolved "https://registry.yarnpkg.com/asynckit/-/asynckit-0.4.0.tgz#c79ed97f7f34cb8f2ba1bc9790bcc366474b4b79"
integrity sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==
available-typed-arrays@^1.0.7:
version "1.0.7"
resolved "https://registry.yarnpkg.com/available-typed-arrays/-/available-typed-arrays-1.0.7.tgz#a5cc375d6a03c2efc87a553f3e0b1522def14846"
@@ -105,6 +114,15 @@ available-typed-arrays@^1.0.7:
dependencies:
possible-typed-array-names "^1.0.0"
axios@^1.7.2:
version "1.7.3"
resolved "https://registry.yarnpkg.com/axios/-/axios-1.7.3.tgz#a1125f2faf702bc8e8f2104ec3a76fab40257d85"
integrity sha512-Ar7ND9pU99eJ9GpoGQKhKf58GpUOgnzuaB7ueNQ5BMi0p+LZ5oaEnfF999fAArcTIBwXTCHAmGcHOZJaWPq9Nw==
dependencies:
follow-redirects "^1.15.6"
form-data "^4.0.0"
proxy-from-env "^1.1.0"
base64-js@^1.3.1:
version "1.5.1"
resolved "https://registry.yarnpkg.com/base64-js/-/base64-js-1.5.1.tgz#1b1b440160a5bf7ad40b650f095963481903930a"
@@ -139,6 +157,13 @@ call-bind@^1.0.0, call-bind@^1.0.2, call-bind@^1.0.5, call-bind@^1.0.6, call-bin
get-intrinsic "^1.2.4"
set-function-length "^1.2.1"
combined-stream@^1.0.8:
version "1.0.8"
resolved "https://registry.yarnpkg.com/combined-stream/-/combined-stream-1.0.8.tgz#c3d45a8b34fd730631a110a8a2520682b31d5a7f"
integrity sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==
dependencies:
delayed-stream "~1.0.0"
data-view-buffer@^1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/data-view-buffer/-/data-view-buffer-1.0.1.tgz#8ea6326efec17a2e42620696e671d7d5a8bc66b2"
@@ -184,6 +209,11 @@ define-properties@^1.1.3, define-properties@^1.2.0, define-properties@^1.2.1:
has-property-descriptors "^1.0.0"
object-keys "^1.1.1"
delayed-stream@~1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/delayed-stream/-/delayed-stream-1.0.0.tgz#df3ae199acadfb7d440aaae0b29e2272b24ec619"
integrity sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==
elliptic@^6.5.5:
version "6.5.5"
resolved "https://registry.yarnpkg.com/elliptic/-/elliptic-6.5.5.tgz#c715e09f78b6923977610d4c2346d6ce22e6dded"
@@ -306,6 +336,11 @@ es7-shim@^6.0.0:
string.prototype.trimleft "^2.0.0"
string.prototype.trimright "^2.0.0"
follow-redirects@^1.15.6:
version "1.15.6"
resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.15.6.tgz#7f815c0cda4249c74ff09e95ef97c23b5fd0399b"
integrity sha512-wWN62YITEaOpSK584EZXJafH1AGpO8RVgElfkuXbTOrPX4fIfOyEpW/CsiNd8JdYrAoOvafRTOEnvsO++qCqFA==
for-each@^0.3.3:
version "0.3.3"
resolved "https://registry.yarnpkg.com/for-each/-/for-each-0.3.3.tgz#69b447e88a0a5d32c3e7084f3f1710034b21376e"
@@ -313,6 +348,15 @@ for-each@^0.3.3:
dependencies:
is-callable "^1.1.3"
form-data@^4.0.0:
version "4.0.0"
resolved "https://registry.yarnpkg.com/form-data/-/form-data-4.0.0.tgz#93919daeaf361ee529584b9b31664dc12c9fa452"
integrity sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==
dependencies:
asynckit "^0.4.0"
combined-stream "^1.0.8"
mime-types "^2.1.12"
fs@^0.0.1-security:
version "0.0.1-security"
resolved "https://registry.yarnpkg.com/fs/-/fs-0.0.1-security.tgz#8a7bd37186b6dddf3813f23858b57ecaaf5e41d4"
@@ -591,6 +635,18 @@ lodash@^4.17.10:
resolved "https://registry.yarnpkg.com/lodash/-/lodash-4.17.21.tgz#679591c564c3bffaae8454cf0b3df370c3d6911c"
integrity sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==
mime-db@1.52.0:
version "1.52.0"
resolved "https://registry.yarnpkg.com/mime-db/-/mime-db-1.52.0.tgz#bbabcdc02859f4987301c856e3387ce5ec43bf70"
integrity sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==
mime-types@^2.1.12:
version "2.1.35"
resolved "https://registry.yarnpkg.com/mime-types/-/mime-types-2.1.35.tgz#381a871b62a734450660ae3deee44813f70d959a"
integrity sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==
dependencies:
mime-db "1.52.0"
minimalistic-assert@^1.0.0, minimalistic-assert@^1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/minimalistic-assert/-/minimalistic-assert-1.0.1.tgz#2e194de044626d4a10e7f7fbc00ce73e83e4d5c7"
@@ -684,6 +740,11 @@ process@^0.11.1:
resolved "https://registry.yarnpkg.com/process/-/process-0.11.10.tgz#7332300e840161bda3e69a1d1d91a7d4bc16f182"
integrity sha512-cdGef/drWFoydD1JsMzuFf8100nZl+GT+yacc2bEced5f9Rjk4z+WtFUTBu9PhOi9j/jfmBPu0mMEY4wIdAF8A==
proxy-from-env@^1.1.0:
version "1.1.0"
resolved "https://registry.yarnpkg.com/proxy-from-env/-/proxy-from-env-1.1.0.tgz#e102f16ca355424865755d2c9e8ea4f24d58c3e2"
integrity sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==
regenerator-runtime@^0.14.0:
version "0.14.0"
resolved "https://registry.yarnpkg.com/regenerator-runtime/-/regenerator-runtime-0.14.0.tgz#5e19d68eb12d486f797e15a3c6a918f7cec5eb45"