Replit exit status -1

Question:

I run my polynomial-feature ridge regression code on Replit, but I'm met with exit status -1 almost every time. It only succeeds about once every 50 runs; the other 99% of the time it fails. Is this a resource issue? My Repl's resources always show as maxed out.
I only bring this up because the same code runs perfectly fine in a Jupyter notebook, producing the same result as the Replit run does on the rare occasion (every 50-100 runs) that it completes.

Thanks

Repl link/Link to where the bug appears:
https://replit.com/@109967496141109/0601#main.py

Screenshots, links, or other helpful context:

The Jupyter notebook gives a result of 980…, the same as the Replit run whenever it does complete.

Debug: x_scaled = [[-0.52510555 -0.71396941 -0.71396941 ... -0.02717632 -0.01921301]]  (156-element vector, trimmed here for brevity)
Debug: X_train_scaled[0] = [ 0.49739726  0.59868805  0.59868805 ... -0.02717632 -0.01921301]  (156-element vector, trimmed here for brevity)
Example Prediction: £980,688.31
/usr/local/lib/python3.10/dist-packages/sklearn/base.py:439: UserWarning: X does not have valid feature names, but StandardScaler was fitted with feature names
  warnings.warn(
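
The UserWarning above is harmless and unrelated to the crash: it appears when a StandardScaler is fitted on a pandas DataFrame (which carries column names) and later asked to transform a plain NumPy array. A small reproduction of the mismatch, with made-up column names "a" and "b" (an assumption about how main.py mixes DataFrames and arrays):

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler

# Two made-up feature columns, for illustration only.
df = pd.DataFrame({"a": [1.0, 2.0, 3.0], "b": [4.0, 5.0, 6.0]})

scaler = StandardScaler().fit(df)   # fitted on a DataFrame, so it remembers column names

scaler.transform(df.to_numpy())     # plain ndarray -> triggers the UserWarning
scaler.transform(df)                # DataFrame with the same columns -> no warning
```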

Replit automatically kills a process that runs out of memory, and the killed process reports exit status -1. Polynomial feature expansion builds a large dense matrix, so a script that fits comfortably in a local Jupyter session can exceed the Repl's RAM limit; that would match the maxed-out resources you're seeing and explain why it only occasionally completes.
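
If memory is the cause, shrinking the expanded feature matrix usually fixes it. A minimal sketch of the idea, not your actual main.py (the placeholder data, degree, and alpha below are assumptions):

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import Ridge

# Placeholder data standing in for the real dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20)).astype(np.float32)  # float32 halves the array size vs float64
y = rng.normal(size=500).astype(np.float32)

model = Pipeline([
    # Keep the degree low (and consider interaction_only=True):
    # the number of expanded features grows combinatorially with degree.
    ("poly", PolynomialFeatures(degree=2, include_bias=False)),
    ("scale", StandardScaler()),
    ("ridge", Ridge(alpha=1.0)),
])

model.fit(X, y)
print("expanded feature count:", model.named_steps["poly"].n_output_features_)
```

Printing the expanded feature count before fitting on Replit is a quick way to check whether the expansion is the step that blows past the Repl's memory limit.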