It's Election Day! November 4, 2025
Time is Running Out: Superintelligent AI is Coming

More than 700 scientists, policymakers, and experts across many fields have signed a letter calling for a halt to the development of superintelligent AI until there are assurances that it can be done safely and that the public supports it. AI safety is an important field, but it lags behind the development and deployment of AI systems. Given how many similar calls have been made in the past, it seems unlikely that tech companies will pause or substantially slow their work because of this letter.

For the uninitiated, superintelligent AI (or Artificial Superintelligence, ASI) is the pinnacle of artificial intelligence development: a system that surpasses human intelligence in all areas. Its abilities in cognition, problem-solving, and self-improvement would be leaps and bounds beyond our current technologies, and it stands to significantly impact every human endeavor. Many believe the development of ASI is imminent, with estimates ranging from within the next five years to decades away.

It's going to take an act of Congress to boost research into AI safety through incentives and, potentially, to put guardrails in place governing safety thresholds in certain areas of public life. This letter should serve as a warning to every lawmaker at every level of government: we cannot afford to be unprepared for what's coming.