From c8082f1e657af6a680b2d2093c7452e77745a8a5 Mon Sep 17 00:00:00 2001
From: Ben
Date: Fri, 24 Jun 2022 13:43:51 +0100
Subject: [PATCH] tidy repo

---
 README.rst                 | 45 +++++++++++++++++--------------------
 data/{v1 => v0.2}/112.sobj | Bin
 data/{v1 => v0.2}/128.sobj | Bin
 data/{v1 => v0.2}/144.sobj | Bin
 data/{v1 => v0.2}/160.sobj | Bin
 data/{v1 => v0.2}/176.sobj | Bin
 data/{v1 => v0.2}/192.sobj | Bin
 data/{v1 => v0.2}/256.sobj | Bin
 data/{v1 => v0.2}/80.sobj  | Bin
 data/{v1 => v0.2}/96.sobj  | Bin
 10 files changed, 21 insertions(+), 24 deletions(-)
 rename data/{v1 => v0.2}/112.sobj (100%)
 rename data/{v1 => v0.2}/128.sobj (100%)
 rename data/{v1 => v0.2}/144.sobj (100%)
 rename data/{v1 => v0.2}/160.sobj (100%)
 rename data/{v1 => v0.2}/176.sobj (100%)
 rename data/{v1 => v0.2}/192.sobj (100%)
 rename data/{v1 => v0.2}/256.sobj (100%)
 rename data/{v1 => v0.2}/80.sobj (100%)
 rename data/{v1 => v0.2}/96.sobj (100%)

diff --git a/README.rst b/README.rst
index b194a4561..017ab8e77 100644
--- a/README.rst
+++ b/README.rst
@@ -3,12 +3,12 @@ Parameter curves for Concrete
 This Github repository contains the code needed to generate the Parameter curves used inside Zama. The repository contains the following files:
 
-- cpp/, Python scripts to generate a cpp file containing the parameter curves
+- cpp/, Python scripts to generate a cpp file containing the parameter curves (needs updating)
 - data/, a folder containing the data generated for previous curves.
-- estimator/, Zama's internal version of the LWE Estimator
-- figs/, a folder containing various figures related to the parameter curves
-- scripts.py, a copy of all scripts required to generate the parameter curves
-- a variety of other python files, used for estimating the security of previous Concrete parameter sets
+- estimator_new/, the Lattice Estimator (TODO: add as a submodule and use Dependabot to alert for new commits)
+- old_files/, legacy files used for previous versions
+- generate_data.py, functions to gather raw data from the Lattice Estimator
+- verify_curves.py, functions to generate and verify curves from raw data
 
 .. image:: logo.svg
    :align: center
@@ -20,13 +20,16 @@ Example
 This is an example of how to generate the parameter curves, and save them to file.
 
 ::
-
-  sage: load("scripts.py")
-  sage: results = get_zama_curves()
-  sage: save(results, "v0.sobj")
+  ./job.sh
 ::
 
-We can load results files, and find the interpolants.
+This will generate several data files: {80, 96, 112, 128, 144, 160, 176, 192, 256}.sobj.
+
+To generate the parameter curves from the data files, we run
+
+``sage verify_curves.py``
+
+This will generate a list of the form:
 
 ::
 
@@ -41,22 +44,15 @@ We can load results files, and find the interpolants.
  (-0.014606812351714953, 3.8493629234693003, 256, 'PASS', 826)]
 ::
 
-Finding the value of n_{alpha} is done manually. We can also verify the interpolants which are generated at the same time:
+Each element is a tuple (a, b, security, P, n_min), where (a, b) are the model
+parameters, security is the security level, P is a flag denoting PASS or
+FAIL of the verification, and n_min is the smallest recommended value of ``n`` to be used.
 
-::
+Each model outputs a value of sigma and is of the form:
 
-  # verify the interpolant used for lambda = 256 (which is interps[-1])
-  sage: z = verify_interpolants(interps[-1], (128,2048), 64)
-  [... code runs, can take ~10 mins ...]
-  # find the index corresponding to n_alpha, which is where security drops below the target security level (256 here)
-  sage: n_alpha = find_nalpha(z, 256)
-  653
-
-  # so the model in this case is
-  (-0.014327640360322604, 2.899270827311096, 653)
-  # which corresponds to
-  # sd(n) = max(-0.014327640360322604 * n + 2.899270827311096, -logq + 2), n >= 653
-::
+``f(a, b, n) = max(ceil(a * n + b), -log2(q) + 2)``
+
+where the -log2(q) + 2 term ensures that we are always using at least two bits of noise.
 
 Version History
 -------------------
@@ -67,6 +63,7 @@ Data for the curves are kept in /data. The following files are present:
 
   v0: generated using the {usvp, dual, decoding} attacks
   v0.1: generated using the {mitm, usvp, dual, decoding} attacks
+  v0.2: generated using the lattice estimator
 ::
 
 TODO List
diff --git a/data/v1/112.sobj b/data/v0.2/112.sobj
similarity index 100%
rename from data/v1/112.sobj
rename to data/v0.2/112.sobj
diff --git a/data/v1/128.sobj b/data/v0.2/128.sobj
similarity index 100%
rename from data/v1/128.sobj
rename to data/v0.2/128.sobj
diff --git a/data/v1/144.sobj b/data/v0.2/144.sobj
similarity index 100%
rename from data/v1/144.sobj
rename to data/v0.2/144.sobj
diff --git a/data/v1/160.sobj b/data/v0.2/160.sobj
similarity index 100%
rename from data/v1/160.sobj
rename to data/v0.2/160.sobj
diff --git a/data/v1/176.sobj b/data/v0.2/176.sobj
similarity index 100%
rename from data/v1/176.sobj
rename to data/v0.2/176.sobj
diff --git a/data/v1/192.sobj b/data/v0.2/192.sobj
similarity index 100%
rename from data/v1/192.sobj
rename to data/v0.2/192.sobj
diff --git a/data/v1/256.sobj b/data/v0.2/256.sobj
similarity index 100%
rename from data/v1/256.sobj
rename to data/v0.2/256.sobj
diff --git a/data/v1/80.sobj b/data/v0.2/80.sobj
similarity index 100%
rename from data/v1/80.sobj
rename to data/v0.2/80.sobj
diff --git a/data/v1/96.sobj b/data/v0.2/96.sobj
similarity index 100%
rename from data/v1/96.sobj
rename to data/v0.2/96.sobj
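
Note: as an illustration of the model introduced in the README changes above, the minimal sketch below evaluates ``f(a, b, n) = max(ceil(a * n + b), -log2(q) + 2)`` for the 256-bit tuple shown in the diff, ``(-0.014606812351714953, 3.8493629234693003, 256, 'PASS', 826)``. The helper name ``evaluate_curve``, the choice ``log2(q) = 64``, and the example dimension ``n = 1024`` are assumptions made for the example; they are not part of this patch or of verify_curves.py.

::

  import math

  # Model parameters for the 256-bit curve, copied from the README diff:
  # (a, b, security, 'PASS'/'FAIL', n_min)
  a, b, security, status, n_min = (-0.014606812351714953, 3.8493629234693003,
                                   256, 'PASS', 826)

  def evaluate_curve(a, b, n, log2_q, n_min):
      """Evaluate f(a, b, n) = max(ceil(a * n + b), -log2(q) + 2).

      The -log2(q) + 2 floor keeps at least two bits of noise, as described
      in the README. Illustrative helper, not a function from this repository.
      """
      if n < n_min:
          raise ValueError(f"curve only verified for n >= {n_min}")
      return max(math.ceil(a * n + b), -log2_q + 2)

  # Example with an assumed 64-bit ciphertext modulus, i.e. log2(q) = 64:
  print(evaluate_curve(a, b, n=1024, log2_q=64, n_min=n_min))  # prints -11
::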