
Evolving Boolean Functions Satisfying Multiple Criteria


Presentation Transcript


1. Evolving Boolean Functions Satisfying Multiple Criteria
John A Clark, Jeremy L Jacob and Susan Stepney (University of York, UK)
Subhamoy Maitra (Indian Statistical Institute, Kolkata, India)
William Millan (ISRC, Queensland University of Technology, Brisbane, Australia)

2. Overview
• Optimisation
• Boolean function design
• Underpinning approach
• Correlation immunity
• Linear change of basis
• Higher-order immunity via change of basis
• Propagation criteria
• Conclusions and future work

3. Optimisation
• A subject of huge practical importance. An optimisation problem may be stated as follows: given a domain D and a function z: D → ℝ, find x in D such that z(x) = sup{z(y): y in D}.
• Example: maximise z(x) = -x² + 8x - 12 over x = 0…100. We can use calculus, which gives x = 4 as the answer, with z(4) = 4.
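A minimal brute-force check of this example, assuming the integer domain 0..100 and the quadratic above:

```python
# Brute-force maximisation of z(x) = -x^2 + 8x - 12 over the integer domain 0..100.
def z(x):
    return -x * x + 8 * x - 12

best_x = max(range(101), key=z)
print(best_x, z(best_x))  # 4 4
```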

4. Local Optimisation - Hill Climbing
• Let the current solution be x.
• Define the neighbourhood N(x) to be the set of solutions that are 'close to' x.
• If possible, move to a neighbouring solution that improves the value of z(x); otherwise stop.
• Choose any y as the next solution provided z(y) >= z(x) (loose hill-climbing).
• Choose y as the next solution such that z(y) = sup{z(v): v in N(x)} (steepest gradient ascent).
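A minimal sketch of steepest gradient ascent on that quadratic, with the neighbourhood N(x) = {x-1, x+1} used on the next slide (function names are illustrative):

```python
# Steepest-ascent hill climbing over the integers 0..100 with N(x) = {x-1, x+1}.
def hill_climb(z, x, lo=0, hi=100):
    while True:
        neighbours = [y for y in (x - 1, x + 1) if lo <= y <= hi]
        best = max(neighbours, key=z)
        if z(best) <= z(x):      # no improving neighbour: stop at a local optimum
            return x
        x = best

print(hill_climb(lambda x: -x * x + 8 * x - 12, 60))  # 4
```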

5. Local Optimisation - Hill Climbing
[Figure: a one-dimensional landscape z(x) with points x0, x1, x2, x3 and the global optimum xopt.]
• The neighbourhood of a point x might be N(x) = {x+1, x-1}.
• The hill-climb goes x0 → x1 → x2, since z(x0) < z(x1) < z(x2) > z(x3), and gets stuck at x2 (a local optimum).
• We really want to obtain xopt.

6. Simulated Annealing
[Figure: a landscape z(x) with a search trajectory x0, x1, ..., x13 that descends out of a local optimum and then climbs to the global optimum.]
• Allows non-improving moves, so that it is possible to go down in z(x) in order to rise again and reach the global optimum.
• The details of annealing are not that important for this talk (other global optimisation techniques could be used), but annealing has proved very effective.
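A generic simulated-annealing skeleton, as a sketch only; the temperature schedule and parameters here are illustrative defaults, not those used in the experiments:

```python
import math
import random

# Generic simulated annealing (maximising z): non-improving moves are accepted with
# probability exp(delta / T), and the temperature T is cooled geometrically.
def anneal(z, x, neighbour, T=10.0, cooling=0.95, moves_per_temp=100, T_min=1e-3):
    best = x
    while T > T_min:
        for _ in range(moves_per_temp):
            y = neighbour(x)
            delta = z(y) - z(x)
            if delta >= 0 or random.random() < math.exp(delta / T):
                x = y
            if z(x) > z(best):
                best = x
        T *= cooling
    return best

# Example on the quadratic from earlier, with N(x) = {x-1, x+1} clipped to 0..100.
step = lambda x: min(100, max(0, x + random.choice((-1, 1))))
print(anneal(lambda x: -x * x + 8 * x - 12, 90, step))  # typically 4
```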

7. What's the paper about?
• There are many desirable properties for Boolean functions in cryptography: balance, high non-linearity, low autocorrelation, high algebraic degree, correlation immunity of reasonable order, propagation immunity, etc.
• The paper seeks to convince you of the following: optimisation is a flexible tool for the design of Boolean functions with multiple desirable properties.
• We will consider two types of search domain:
• D = balanced Boolean functions; and
• D = sets of vectors that are Walsh (autocorrelation) zeroes.

8. Boolean Function Design
• A Boolean function f on three variables, shown with its polar form f̂(x) = (-1)^f(x):

x    x1 x2 x3   f(x)   f̂(x)
0    0  0  0     1      -1
1    0  0  1     0       1
2    0  1  0     0       1
3    0  1  1     0       1
4    1  0  0     1      -1
5    1  0  1     0       1
6    1  1  0     1      -1
7    1  1  1     1      -1

• For present purposes we shall use the polar representation.
• We will talk only about balanced functions, where there are equal numbers of 1s and -1s.
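A small sketch of the polar representation and the balance check, using the truth table above:

```python
# Truth table of f from the slide, indexed by x = 0..7.
f = [1, 0, 0, 0, 1, 0, 1, 1]

# Polar form: f_hat(x) = (-1)**f(x), i.e. 0 -> +1 and 1 -> -1.
f_hat = [(-1) ** v for v in f]

# Balanced means equal numbers of +1s and -1s (equivalently, the polar values sum to 0).
print(f_hat)            # [-1, 1, 1, 1, -1, 1, -1, -1]
print(sum(f_hat) == 0)  # True
```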

9. Preliminary Definitions
• Definitions relating to a Boolean function f of n variables.
• Linear function: Lw(x) = w1x1 ⊕ … ⊕ wnxn, with polar form L̂w(x) = (-1)^Lw(x).
• Walsh-Hadamard transform: F(w) = Σx f̂(x) L̂w(x).
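A straightforward O(4^n) sketch of the Walsh-Hadamard transform in the polar convention above (fine for the small examples here; a fast transform would be used in practice):

```python
# Walsh-Hadamard transform F(w) = sum_x f_hat(x) * (-1)^(w.x), polar convention.
def walsh_hadamard(f_hat):
    n_points = len(f_hat)
    def dot(w, x):
        return bin(w & x).count("1") & 1      # w.x mod 2
    return [sum(f_hat[x] * (-1) ** dot(w, x) for x in range(n_points))
            for w in range(n_points)]

f_hat = [-1, 1, 1, 1, -1, 1, -1, -1]          # the example function from slide 8
F = walsh_hadamard(f_hat)
print(F)                                      # F[0] = 0 because f is balanced
```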

10. Preliminary Definitions
• Non-linearity: Nf = (2^n - max over w of |F(w)|) / 2.
• Autocorrelation: ACf = max over s ≠ 0 of |Σx f̂(x) f̂(x ⊕ s)|.
• For present purposes we need simply note that these can easily be evaluated given a function f. They can therefore be used as the functions to be optimised, and traditionally they are.
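A sketch computing both measures from a polar truth table, reusing walsh_hadamard from the previous snippet:

```python
# Nonlinearity and (maximum absolute) autocorrelation of a polar truth table of length 2^n.
def nonlinearity(f_hat):
    F = walsh_hadamard(f_hat)                 # from the earlier sketch
    return (len(f_hat) - max(abs(v) for v in F)) // 2

def autocorrelation(f_hat):
    n_points = len(f_hat)
    return max(abs(sum(f_hat[x] * f_hat[x ^ s] for x in range(n_points)))
               for s in range(1, n_points))

f_hat = [-1, 1, 1, 1, -1, 1, -1, -1]
print(nonlinearity(f_hat), autocorrelation(f_hat))
```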

11. Basic Functions Using Parseval's Theorem
• Parseval's Theorem: Σw F(w)² = 2^(2n).
• Loosely, push down on F(w)² for some particular w and it appears elsewhere (compare Pythagoras: a² + b² = c²).
• This suggests that arranging for uniform values of F(w)² will lead to good non-linearity. (Bent functions achieve this, but we are concerned with balanced functions.)
• This is the initial motivation for our new cost function family. NEW FUNCTION!
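A quick numerical check of Parseval's theorem for the example function (again reusing walsh_hadamard):

```python
# Numerical check of Parseval's theorem for the slide-8 function (n = 3, so 2^(2n) = 64).
F = walsh_hadamard([-1, 1, 1, 1, -1, 1, -1, -1])
print(sum(v * v for v in F))  # 64
```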

12. Moves Preserving Balance
• Start with a balanced (but otherwise random) solution. The move strategy preserves balance (Millan et al).
• The neighbourhood of a particular function f is the set of all functions obtained by exchanging (flipping) any two dissimilar values. Here we have swapped f(2) and f(4):

x    x1 x2 x3   f(x)   f̂(x)   ĝ(x)
0    0  0  0     1      -1     -1
1    0  0  1     0       1      1
2    0  1  0     0       1     -1
3    0  1  1     0       1      1
4    1  0  0     1      -1      1
5    1  0  1     0       1      1
6    1  1  0     1      -1     -1
7    1  1  1     1      -1     -1

• Note that neighbouring functions have close non-linearity and autocorrelation, so there is some degree of continuity.
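A minimal sketch of the balance-preserving move, which swaps one +1 with one -1 in the polar truth table (the function name is illustrative):

```python
import random

# Balance-preserving move: pick one position holding +1 and one holding -1 and swap them.
# The result is a neighbour g of f with the same number of +1s and -1s.
def neighbour(f_hat):
    g = list(f_hat)
    i = random.choice([x for x, v in enumerate(g) if v == 1])
    j = random.choice([x for x, v in enumerate(g) if v == -1])
    g[i], g[j] = g[j], g[i]
    return g

f_hat = [-1, 1, 1, 1, -1, 1, -1, -1]
print(neighbour(f_hat))   # e.g. the g(x) column above, if positions 2 and 4 are chosen
```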

13. Simple Hill Climbing Result
• Even simple hill-climbing can be used to good effect.
• By perturbing a 15-variable balanced Boolean function of non-linearity 16262 (obtained by modifying Patterson-Wiedemann functions) and hill-climbing, we were able to obtain a non-linearity of 16264 (the best known non-linearity so far for 15-variable balanced functions).

14. Getting in the Right Area
• Actually minimising this cost function family doesn't in itself give good results!
• But it is very good at getting into the right area.
• The method is:
• Using simulated annealing, minimise the cost function given (for given parameter values of X and R). Let the resulting function be fsa.
• Now hill-climb with respect to non-linearity (Nonlinearity Targeted technique, NLT); OR
• Now hill-climb with respect to autocorrelation (Autocorrelation Targeted technique, ACT).
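A sketch of a cost function of this flavour. The family is parameterised by X and R; the exact form below, sum over w of ||F(w)| - X|^R, is an assumption based on that description:

```python
# Assumed form of the cost function family: cost(f) = sum_w | |F(w)| - X |^R.
# Minimising it pushes the Walsh spectrum towards a uniform magnitude X (cf. Parseval),
# after which nonlinearity (NLT) or autocorrelation (ACT) is hill-climbed directly.
def cost(f_hat, X, R):
    F = walsh_hadamard(f_hat)                 # from the earlier sketch
    return sum(abs(abs(v) - X) ** R for v in F)

print(cost([-1, 1, 1, 1, -1, 1, -1, -1], X=4.0, R=3.0))
```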

15. Best Profiles
[Table of best profiles achieved by the NLT and ACT techniques, given as (n, degree, nonlinearity, autocorrelation) tuples.]

16. Autocorrelation-related Results
• In 1995 Zheng and Zhang introduced the two global avalanche criteria (autocorrelation and sum-of-squares). Autocorrelation bounds are now receiving more attention.
[Table of autocorrelation results.]
• The best construction results are due to Maitra.
• For n = 8, both techniques (NLT and ACT) achieve lower autocorrelation than any previous construction or conjecture.

17. Sum of Squares Conjectures
• Zheng and Zhang also introduced the sum-of-squares indicator σf = Σs (Σx f̂(x) f̂(x ⊕ s))².
• Use σf as the cost function.
• Oddly, the earlier functions actually gave better results!

18. Correlation Immunity - Direct Method
• Seek to punish lack of correlation immunity and low non-linearity in the cost function.
• The results are sub-optimal.
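An illustrative sketch of what such a direct cost function might look like (CI(1) requires F(w) = 0 for every w of Hamming weight 1; the penalty terms and weighting are assumptions, not the paper's exact function):

```python
# Direct-method cost (illustrative): punish non-zero Walsh values at weight-1 w
# (lack of CI(1)) and punish low nonlinearity via the largest Walsh magnitude.
def ci1_cost(f_hat, lam=1.0):
    F = walsh_hadamard(f_hat)                            # from the earlier sketch
    n_points = len(f_hat)
    ci_penalty = sum(abs(F[w]) for w in range(n_points) if bin(w).count("1") == 1)
    nl_penalty = max(abs(v) for v in F)                  # smaller max |F(w)| = higher nonlinearity
    return ci_penalty + lam * nl_penalty

print(ci1_cost([-1, 1, 1, 1, -1, 1, -1, -1]))
```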

19. Linear Transformation for CI(1)
• Let WZf be the set of Walsh zeroes of the function f.
• If Rank(WZf) = n, form the matrix Bf whose rows are n linearly independent vectors from WZf.
• Let Cf = Bf⁻¹ and let f'(x) = f(Cf x).
• The resulting function f' has the same nonlinearity and algebraic degree and is also CI(1).
• We can apply this method to the basic functions generated earlier.
• This method was used earlier by Maitra and Pasalic.
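A sketch of the change-of-basis recipe above, reusing walsh_hadamard from the earlier snippet; the helper names and the example function are illustrative:

```python
# Vectors w are ints in 0..2^n-1; bit i of w is coordinate i; all arithmetic is over GF(2).

def parity_dot(a, b):
    return bin(a & b).count("1") & 1

def gf2_matvec(rows, x):
    """y = M x over GF(2); M is given as a list of row bit-masks."""
    y = 0
    for i, r in enumerate(rows):
        y |= parity_dot(r, x) << i
    return y

def gf2_inverse(rows):
    """Invert an invertible n x n GF(2) matrix given as row bit-masks (Gauss-Jordan)."""
    n = len(rows)
    aug = [[rows[i], 1 << i] for i in range(n)]          # [row of M, row of identity]
    for col in range(n):
        piv = next(r for r in range(col, n) if (aug[r][0] >> col) & 1)
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(n):
            if r != col and (aug[r][0] >> col) & 1:
                aug[r][0] ^= aug[col][0]
                aug[r][1] ^= aug[col][1]
    return [a[1] for a in aug]

def pick_independent(vectors, n):
    """Greedily pick n of the given vectors that are linearly independent, or None."""
    chosen, basis = [], [0] * n
    for v in vectors:
        r = v
        for i in reversed(range(n)):
            if (r >> i) & 1 and basis[i]:
                r ^= basis[i]
        if r:                                            # v is independent of those chosen so far
            basis[r.bit_length() - 1] = r
            chosen.append(v)
            if len(chosen) == n:
                return chosen
    return None

def ci1_by_change_of_basis(f_hat, n):
    F = walsh_hadamard(f_hat)                            # from the earlier sketch
    wz = [w for w in range(1, 2 ** n) if F[w] == 0]      # Walsh zeroes of f
    B = pick_independent(wz, n)                          # rows of B_f
    if B is None:
        return None                                      # Rank(WZ_f) < n: method not applicable
    C = gf2_inverse(B)                                   # C_f = B_f^{-1}
    return [f_hat[gf2_matvec(C, x)] for x in range(2 ** n)]   # f'(x) = f(C_f x)

# Example: f(x1,..,x4) = x1 x2 XOR x3 (balanced, not CI(1), Walsh zeroes of full rank).
n = 4
f_hat = [(-1) ** (((x & 1) & ((x >> 1) & 1)) ^ ((x >> 2) & 1)) for x in range(2 ** n)]
g_hat = ci1_by_change_of_basis(f_hat, n)
G = walsh_hadamard(g_hat)
print([G[1 << i] for i in range(n)])                     # all zero, so f' is CI(1)
```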

20. Best Profiles Overall (direct, and direct plus change of basis)
• Some previous bests: (6,1,2,24,64), (7,1,5,56,64) [Sarkar and Maitra, 2000]; (8,1,6,116,80) [Maitra and Pasalic, 2002]; (7,2,4,56) [Pasalic, Maitra, Johansson and Sarkar, 2000].
• (8,1,6,116,24) seems very good; no (8,0,*,116,16) has yet been discovered.
• Optimal non-linearity, with typically very low autocorrelation values.

21. Generalising to Higher Order Immunity
• Basis transformation can achieve higher-order immunity functions too.
• We need to find a subset {w1, …, wn} of the Walsh zeroes such that any k of its elements (1 <= k <= m), wi1, wi2, …, wik, sum to a Walsh zero.

22. Generalising to Higher Order Immunity
• Consider an initial permutation pwz of the Walsh zeroes. We will view the first n elements of the permutation as a candidate basis.
• How should we punish deviation from the requirements?

23. Generalising to Higher Order Immunity
• By punishing lack of suitable rank, and punishing relevant sums not being Walsh zeroes.
• For example, for m = 2 we can define the number of misses as the number of two-fold sums that are not Walsh zeroes.
• The cost function combines these penalties.
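A sketch of the m = 2 miss count for a candidate basis (the first n elements of a permutation of the Walsh zeroes); names are illustrative:

```python
from itertools import combinations

# Number of "misses" for m = 2: two-fold sums (XORs) of candidate basis vectors
# that are not Walsh zeroes of f.
def misses_m2(candidate_basis, walsh_zero_set):
    return sum(1 for wi, wj in combinations(candidate_basis, 2)
               if wi ^ wj not in walsh_zero_set)

# Example usage with the Walsh zeroes of some polar truth table f_hat on n variables:
# wz = {w for w in range(1, 2 ** n) if walsh_hadamard(f_hat)[w] == 0}
# print(misses_m2(candidate[:n], wz))
```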

24. Generalising to Higher Order Immunity
• This approach has allowed basis sets to be evolved with second-order correlation immunity (e.g. some direct attempts to achieve (7,2,4,56) failed: they had the required degree and non-linearity but were not CI(2)).
• Basis transformations allowed (7,2,4,56) to be attained.
• It seems difficult to attain bases which give CI(3), but attempts are currently under way.

25. Transforming for Propagation Criteria
• Change-of-basis approaches can also be applied to attain PC(k)-ness.
• Essentially we now work with autocorrelation zeroes.
• Only a small amount of work has been done on this, but the results are encouraging:
• We can use a linear transform on the (8,0,6,116,24) function derived earlier to attain an (8,0,6,116,24) function with PC(1).
• It is also possible to transform for higher-order PC(k) in much the same fashion as before (but now we count autocorrelation misses).

26. Transforming for Propagation Criteria
• We have tried this on earlier functions to seek out bases of autocorrelation zeroes that give PC(2) functions.
• Prior to 1997 the highest algebraic degree achieved for a PC(2) function was n/2 (for bent functions).
• Satoh et al [1998] gave constructions on n = L + 2L - 1 input bits with algebraic degree n - L - 1 (and similar for balanced functions). They note that deg(f) <= n - 1 gives a trivial upper bound on the degree.
• Searching for a 2nd-order change of basis reveals an earlier function on 6 variables which is PC(2) with degree 5. Support = c65b4d405ceb91f1

27. PC(k) and CI(m) Together
• We can use a cost function that punishes lack of PC(k)-ness, lack of CI(m)-ness and low non-linearity.
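An illustrative sketch of such a combined cost; the penalty terms and weights are assumptions, not the paper's exact function. CI(m) wants F(w) = 0 for all w of weight 1..m, PC(k) wants zero autocorrelation at all shifts of weight 1..k, and a low max |F(w)| corresponds to high non-linearity:

```python
# Illustrative combined cost for PC(k) and CI(m) together, plus a nonlinearity term.
def combined_cost(f_hat, k, m, a=1.0, b=1.0, c=1.0):
    n_points = len(f_hat)
    F = walsh_hadamard(f_hat)                            # from the earlier sketch
    wt = lambda v: bin(v).count("1")
    ci_pen = sum(abs(F[w]) for w in range(n_points) if 1 <= wt(w) <= m)
    pc_pen = sum(abs(sum(f_hat[x] * f_hat[x ^ s] for x in range(n_points)))
                 for s in range(1, n_points) if wt(s) <= k)
    nl_pen = max(abs(v) for v in F)
    return a * ci_pen + b * pc_pen + c * nl_pen

# Example: combined_cost(f_hat, k=1, m=1) for some polar truth table f_hat.
```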

28. Conclusions
• Optimisation is a very useful tool for Boolean function design and exploration.
• We have generated functions with excellent profiles over several criteria. The method would seem extensible.
• Basic functions have very special properties.
• Theory helps! Change of basis is very useful.

29. Further Work
• Spectrum-based approaches: some work already completed.
• Planting trapdoors! Who says you have to be honest about the cost function used? We said the method is extensible; there is nothing to stop it being maliciously extended!
• Some work on S-box generalisations completed.
• More on PC(k) with CI(m): very little attempted so far.
• Extend the work on bases for higher-order immunities.
• Other work on metaheuristic search and protocols, block cipher and public key cryptanalysis.
