oath_base32_decode(3) | liboath | oath_base32_decode(3) |
NAME
oath_base32_decode - API function
SYNOPSIS
#include <oath.h>

int oath_base32_decode(const char *in, size_t inlen, char **out, size_t *outlen);
DESCRIPTION
Decode a base32 encoded string into binary data.

Space characters are ignored and pad characters are added if needed. Other non-base32 characters are not ignored; they cause the function to fail with OATH_INVALID_BASE32.
The in parameter should contain inlen bytes of base32 encoded data. The function allocates a new string in *out to hold the decoded data, and sets *outlen to the length of the data.
If out is NULL, then *outlen will be set to what would have been the length of *out on successful decoding.
If the caller is not interested in the length of the output data, outlen may be set to NULL.
It is permitted but useless to have both out and outlen NULL.
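For instance, the out == NULL convention can be used to query the decoded size without allocating anything. The helper below is a minimal sketch of that pattern; base32_decoded_length is a hypothetical name, not part of the liboath API:

  #include <oath.h>

  /* Hypothetical helper, not part of liboath: return the decoded
     length of a base32 string, or -1 if the input is not valid base32. */
  static long
  base32_decoded_length (const char *in, size_t inlen)
  {
    size_t needed;
    if (oath_base32_decode (in, inlen, NULL, &needed) != OATH_OK)
      return -1;
    return (long) needed;
  }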
RETURN VALUE
On success, OATH_OK (zero) is returned. OATH_INVALID_BASE32 is returned if the input contains non-base32 characters, and OATH_MALLOC_ERROR is returned on memory allocation errors.
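EXAMPLES
The following program is a minimal sketch, not taken from this manual: it decodes a fixed base32 string, prints the result, and frees the buffer allocated by the function. It assumes linking against liboath (e.g. with -loath); oath_init(3), oath_done(3), and oath_strerror(3) are existing liboath functions.

  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>
  #include <oath.h>

  int
  main (void)
  {
    /* "GEZDGNBVGY3TQOJQ" is the base32 encoding of "1234567890". */
    const char *b32 = "GEZDGNBVGY3TQOJQ";
    char *out;
    size_t outlen;
    int rc;

    rc = oath_init ();   /* initialize liboath before use, see oath_init(3) */
    if (rc != OATH_OK)
      return EXIT_FAILURE;

    rc = oath_base32_decode (b32, strlen (b32), &out, &outlen);
    if (rc != OATH_OK)
      {
        fprintf (stderr, "decoding failed: %s\n", oath_strerror (rc));
        oath_done ();
        return EXIT_FAILURE;
      }

    printf ("decoded %zu bytes: %.*s\n", outlen, (int) outlen, out);

    free (out);   /* release the buffer allocated in *out */
    oath_done ();
    return EXIT_SUCCESS;
  }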
SINCE
1.12.0
REPORTING BUGS
Report bugs to <oath-toolkit-help@nongnu.org>.

liboath home page: http://www.gnu.org/software/liboath/

General help using GNU software: http://www.gnu.org/gethelp/
COPYRIGHT
Copyright © 2009-2020 Simon Josefsson.

Copying and distribution of this file, with or without modification, are permitted in any medium without royalty provided the copyright notice and this notice are preserved.
2.6.6 | liboath |