Memory leak in glfwGetRequiredInstanceExtensions

This minimal GLFW program leaks memory according to Valgrind. However, the returned pointer is not a heap allocation, and attempting to free it triggers a double free/corruption abort in glibc.

uint32_t n;
// The returned array is owned by GLFW; the caller must not free it.
const char** glfw_extension_list = glfwGetRequiredInstanceExtensions(&n);

==408378== HEAP SUMMARY:
==408378== in use at exit: 278,892 bytes in 2,880 blocks
==408378== total heap usage: 24,344 allocs, 21,464 frees, 451,741,092 bytes allocated
==408378== LEAK SUMMARY:
==408378== definitely lost: 1,120 bytes in 20 blocks
==408378== indirectly lost: 2,818 bytes in 63 blocks
==408378== possibly lost: 0 bytes in 0 blocks
==408378== still reachable: 274,954 bytes in 2,797 blocks
==408378== suppressed: 0 bytes in 0 blocks

This is rather inconvenient, because I can't tell my own leaks from GLFW's leaks.
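One way to keep third-party leaks out of your own reports is a Valgrind suppression file. A sketch is below; the suppression name is made up, and in practice you would generate the exact stack patterns from your own run with `valgrind --gen-suppressions=all`:

```
# glfw-vulkan.supp (hypothetical): hide leaks whose stacks pass through
# glfwGetRequiredInstanceExtensions, i.e. the Vulkan loader/driver init.
{
   glfw_vulkan_loader_init_leak
   Memcheck:Leak
   match-leak-kinds: definite,indirect
   ...
   fun:glfwGetRequiredInstanceExtensions
}
```

Then run `valgrind --leak-check=full --suppressions=glfw-vulkan.supp ./app` and only leaks outside that call path are reported.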

The function glfwGetRequiredInstanceExtensions() does not allocate any memory; it simply returns the address of the strings stored in the _GLFWlibrary _glfw variable.

It is possible that glfwInit()/glfwTerminate() are leaking, but it's also possible that one of the libraries they use has the leak.

A trivial glfwInit()/glfwTerminate() program doesn't leak:
==3253== definitely lost: 0 bytes in 0 blocks

A trivial glfwInit()/glfwGetRequiredInstanceExtensions()/glfwTerminate() program does leak:
==3293== definitely lost: 1,120 bytes in 20 blocks

Stepping through the code, I had missed that glfwGetRequiredInstanceExtensions() calls _glfwInitVulkan(), which loads the Vulkan library dynamically and then queries the extensions.

Checking through the code on Win32, the GLFW library itself does not leak (most of the code here is cross-platform, so I assume this is also the case on other platforms), but the Vulkan loader or GPU driver libraries might.