Submitted by kegzilla t3_yk5kgj in singularity
visarga t1_iusk21l wrote
Reply to comment by ProShortKingAction in Robots That Write Their Own Code by kegzilla
They take a few preventive measures:
> we first check that it is safe to run by ensuring there are no import statements, special variables that begin with __, or calls to exec and eval. Then, we call Python’s exec function with the code as the input string and two dictionaries that form the scope of that code execution: (i) globals, containing all APIs that the generated code might call, and (ii) locals, an empty dictionary which will be populated with variables and new functions defined during exec. If the LMP is expected to return a value, we obtain it from locals after exec finishes.
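A minimal sketch of what that quoted check might look like in practice. The names `is_safe`, `run_lmp`, and the `ret_val` return convention are my own assumptions, not the paper's actual implementation; it just illustrates the idea of a static scan followed by a scoped `exec`.

```python
import ast

def is_safe(code: str) -> bool:
    """Static check along the lines the paper describes: reject import
    statements, names beginning with __, and calls to exec/eval.
    (A sketch of the idea, not the authors' implementation.)"""
    tree = ast.parse(code)
    for node in ast.walk(tree):
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            return False
        if isinstance(node, ast.Name) and node.id.startswith("__"):
            return False
        if isinstance(node, ast.Call):
            func = node.func
            name = func.id if isinstance(func, ast.Name) else getattr(func, "attr", "")
            if name in ("exec", "eval"):
                return False
    return True

def run_lmp(code: str, apis: dict, ret_name: str = "ret_val"):
    """Run generated code with the two-dictionary scope described above:
    globals holds the allowed APIs, locals collects new definitions."""
    if not is_safe(code):
        raise ValueError("generated code failed the safety check")
    gvars = dict(apis)  # (i) globals: APIs the generated code may call
    lvars = {}          # (ii) locals: populated during exec
    exec(code, gvars, lvars)
    return lvars.get(ret_name)  # return value, if the LMP produces one
```

As the comments below point out, this only rules out the most obvious escape hatches; it is not a real sandbox.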
ProShortKingAction t1_iuskyif wrote
This seems to be saying "safe to run" as in "less likely to crash", not as in preventing cybersecurity issues.
visarga t1_iuvrqym wrote
It blocks access to various Python APIs and to exec and eval, but yes, it's just a basic check.