[GAP Forum] Correction: How to read loops from 1 file.

Alexander Konovalov alexk at mcs.st-andrews.ac.uk
Mon Feb 13 08:25:37 GMT 2012


On 13 Feb 2012, at 05:21, Aasma Shaheen wrote:

> A loop is just like a group structure in which associativity does not hold.
> When I write a single loop, say
> 0 1 2 3 4 5 6 7 8 9 10 11 12 13
> 1 0 3 2 5 4 8 9 6 7 12 13 10 11
> 2 3 1 0 6 10 11 5 12 13 7 4 9 8
> 3 2 0 1 11 7 4 10 13 12 5 6 8 9
> 4 5 6 11 1 0 12 3 9 10 2 8 13 7
> 5 4 10 7 0 1 2 13 11 8 9 3 6 12
> 6 8 11 4 12 2 13 0 10 1 3 9 7 5
> 7 9 5 10 3 13 0 12 1 11 8 2 4 6
> 8 6 13 12 9 11 10 1 5 0 4 7 2 3
> 9 7 12 13 10 8 1 11 0 4 6 5 3 2
> 10 12 7 5 2 9 3 8 4 6 13 0 11 1
> 11 13 4 6 8 3 9 2 7 5 0 12 1 10
> 12 10 8 9 13 6 7 4 3 2 11 1 5 0
> 13 11 9 8 7 12 5 6 2 3 1 10 0 4
> in a file "a1", load the package "loops", and then call it as
> 
> LoopFromFile("a1","Replace");
> 
> now hopefully you all will understand better.

This works well for me:

gap> LoadPackage("loops");
This version of LOOPS is ready for GAP 4.4.
    ======================================================
      LOOPS: Computing with quasigroups and loops in GAP  
                        version 2.0.0                     
             Gabor P. Nagy and Petr Vojtechovsky          
    ------------------------------------------------------
     contact:  nagyg at math.u-szeged.hu or petr at math.du.edu 
    ======================================================

true
gap> LoopFromFile("~/Desktop/a1.txt","Replace");
<loop of order 14>
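Independently of GAP, the table posted above can be checked directly. The following is a minimal Python sketch (the table is copied verbatim from the message, with the same 0-based entries as in the file) that verifies it is a Latin square with 0 as a two-sided identity, and searches for a failure of associativity:

```python
# Cayley table of the loop of order 14 posted above (0-based entries).
T = [
    [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13],
    [1, 0, 3, 2, 5, 4, 8, 9, 6, 7, 12, 13, 10, 11],
    [2, 3, 1, 0, 6, 10, 11, 5, 12, 13, 7, 4, 9, 8],
    [3, 2, 0, 1, 11, 7, 4, 10, 13, 12, 5, 6, 8, 9],
    [4, 5, 6, 11, 1, 0, 12, 3, 9, 10, 2, 8, 13, 7],
    [5, 4, 10, 7, 0, 1, 2, 13, 11, 8, 9, 3, 6, 12],
    [6, 8, 11, 4, 12, 2, 13, 0, 10, 1, 3, 9, 7, 5],
    [7, 9, 5, 10, 3, 13, 0, 12, 1, 11, 8, 2, 4, 6],
    [8, 6, 13, 12, 9, 11, 10, 1, 5, 0, 4, 7, 2, 3],
    [9, 7, 12, 13, 10, 8, 1, 11, 0, 4, 6, 5, 3, 2],
    [10, 12, 7, 5, 2, 9, 3, 8, 4, 6, 13, 0, 11, 1],
    [11, 13, 4, 6, 8, 3, 9, 2, 7, 5, 0, 12, 1, 10],
    [12, 10, 8, 9, 13, 6, 7, 4, 3, 2, 11, 1, 5, 0],
    [13, 11, 9, 8, 7, 12, 5, 6, 2, 3, 1, 10, 0, 4],
]

n = len(T)

# Latin square: every row and every column is a permutation of 0..n-1.
rows_ok = all(sorted(row) == list(range(n)) for row in T)
cols_ok = all(sorted(T[i][j] for i in range(n)) == list(range(n))
              for j in range(n))

# 0 is a two-sided identity: 0*x = x*0 = x.
identity_ok = all(T[0][j] == j and T[j][0] == j for j in range(n))

# Look for a triple (a, b, c) with (a*b)*c != a*(b*c).
fail = next(((a, b, c)
             for a in range(n) for b in range(n) for c in range(n)
             if T[T[a][b]][c] != T[a][T[b][c]]), None)

print("Latin square:", rows_ok and cols_ok)
print("Identity element 0:", identity_ok)
print("Associativity counterexample:", fail)
```

Since all rows and columns are permutations and 0 is an identity, the table does define a loop of order 14, and the search finds an associativity counterexample, so it is a proper (non-group) loop — consistent with what LoopFromFile reports.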

So I do not understand which problem you mean below. If
it is no longer there, that is fine. If it is still there, please
describe it more precisely; otherwise it is difficult to help.

By any chance, are you trying to read a different file with a
very large loop, beyond the capacity of the LOOPS package? What
precisely runs for a very long time?

Alexander


> My problem is still there :).
> But I have calculated it somehow by a very, very long method, but a
> computer program should read it in minutes; unfortunately, GAP is not
> doing this.



