Sorry for not updating this blog for the last couple of weeks; I forgot about it after the semester began. I'll use this post to summarize the project's progress over the past month. Before that, however, I'd like to note that the current module does indeed perform LLL, LLL with removals and ULLL, which was the primary goal of the project.
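For anyone who just wants to try it out, here is a minimal usage sketch of the default wrapper. The names (fmpz_lll_context_init_default, fmpz_lll and the fmpz_mat helpers) follow the module's documentation, but take the exact signatures with a grain of salt, as things may still move around:

```c
/* A minimal usage sketch of the module's default LLL wrapper.
 * The function names follow the fmpz_lll documentation; treat the
 * exact signatures as an assumption rather than a reference. */
#include <stdio.h>
#include "flint/fmpz.h"
#include "flint/fmpz_mat.h"
#include "flint/fmpz_lll.h"

int main(void)
{
    fmpz_mat_t B, U;
    fmpz_lll_t fl;

    fmpz_mat_init(B, 3, 3);
    fmpz_mat_init(U, 3, 3);
    fmpz_mat_one(U);                    /* U will accumulate the unimodular transform */

    /* a small, deliberately skewed basis */
    fmpz_set_si(fmpz_mat_entry(B, 0, 0), 1);
    fmpz_set_si(fmpz_mat_entry(B, 0, 2), 398);
    fmpz_set_si(fmpz_mat_entry(B, 1, 1), 1);
    fmpz_set_si(fmpz_mat_entry(B, 1, 2), 663);
    fmpz_set_si(fmpz_mat_entry(B, 2, 2), 1000);

    fmpz_lll_context_init_default(fl);  /* default parameters: delta = 0.99, eta = 0.51 */
    fmpz_lll(B, U, fl);                 /* reduce B in place, recording the transform in U */

    fmpz_mat_print_pretty(B);
    printf("\n");

    fmpz_mat_clear(B);
    fmpz_mat_clear(U);
    return 0;
}
```

Passing a matrix for U (initialised to the identity) records the transformation that was applied, which is handy when the reduced basis needs to be related back to the input.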
So, starting off where I left off in the previous post: augmenting the basis matrix with the identity does indeed help the numerical stability of ULLL. Hence, I modified the ULLL function to be similar to the one in flint-1.6.
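To recall the idea (stated loosely): ULLL reduces a truncated copy B' of the basis, and every row operation performed on that copy is mirrored on the appended identity block, so the reduction of the augmented matrix looks like

$$(B' \mid I) \;\longrightarrow\; (U B' \mid U),$$

and the unimodular transformation U read off from the identity block can then be applied to the original, full-precision basis.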
Also, the default LLL functions (which are essentially wrappers around ULLL) were added. The LLL-reducedness checkers were also changed to implement the certification method introduced by Villard. The method does not always succeed, but when it does the result is guaranteed. It is efficient, and much faster than running the L^2 algorithm at its provable precision or testing with exact arithmetic.
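For context, what these checkers certify is the standard notion of reducedness: a basis $b_1, \dots, b_d$ with Gram-Schmidt vectors $b_i^*$ and coefficients $\mu_{i,j}$ is $(\delta, \eta)$-LLL-reduced when

$$|\mu_{i,j}| \le \eta \quad (1 \le j < i \le d), \qquad \delta\,\|b_{i-1}^*\|^2 \le \|b_i^*\|^2 + \mu_{i,i-1}^2\,\|b_{i-1}^*\|^2 \quad (2 \le i \le d),$$

with $1/4 < \delta \le 1$ and $1/2 \le \eta < \sqrt{\delta}$. As I understand it, the certification verifies these conditions using an approximate R-factor of the basis together with rigorous error bounds, rather than exact arithmetic.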
Gram matrix versions of the provable is_reduced functions were also added.
This was followed by an implementation of a modified LLL reduction algorithm due to Storjohann, which has better complexity in terms of the lattice dimension. However, it didn't fare well against the other LLL implementations (specifically the classical LLL) during profiling. The mpf versions of the is_reduced functions were removed because they weren't provably correct, and mpfr versions were written in their place.
As mentioned on flint-devel, adding a function which uses mpf and epsilons to emulate the behaviour of changing the rounding mode is still a possibility. Finally, the documentation was updated and should now (hopefully) explain all the strategies and types used in this module (along with the function descriptions, of course).
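To sketch what such a function might look like (nothing like this exists in the module yet, and the helper name below is made up): since mpf has no rounding-mode control, one can compensate after each operation by nudging the result outward by a few ulps, which gives a guaranteed upper (or lower) bound provided mpf's own error stays below that slack.

```c
/* A rough sketch (not in the module): emulate "round towards +infinity"
 * for an mpf multiplication by adding a couple of ulps to the result.
 * The helper name round_up_mul is hypothetical. */
#include "gmp.h"

/* Set rop to an upper bound on a*b: compute the product at the working
 * precision, then add |rop| * 2^(2 - prec), which is assumed to exceed
 * the rounding error of a single mpf operation. */
static void round_up_mul(mpf_t rop, const mpf_t a, const mpf_t b)
{
    mp_bitcnt_t prec = mpf_get_prec(rop);
    mpf_t eps;

    mpf_init2(eps, prec);
    mpf_mul(rop, a, b);                /* approximate product; no rounding direction is specified */
    mpf_abs(eps, rop);
    mpf_div_2exp(eps, eps, prec - 2);  /* eps = |rop| * 2^(2 - prec), a few ulps of slack */
    mpf_add(rop, rop, eps);            /* push the result upwards */
    mpf_clear(eps);
}
```

A matching round_down_mul would subtract eps instead, and the same pattern extends to addition and subtraction.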
I'd like to thank my mentors, William Hart and Curtis Bright, for their valuable time, support and patience. This project wouldn't have been possible without their help. I'd also like to thank Fredrik Johansson for reviewing some of my code and Burcin Erocal from the lmonade organization for his advice. I'm grateful to the Google OSPO team for running the GSoC program. This has been a great summer and contributing to FLINT has been a tremendous learning experience for me.