Issue
I have a dataframe with the dtypes shown below, and I want to insert it into a Postgres DB, but the insert fails with the error: can't adapt type 'numpy.int64'
id_code int64
sector object
created_date float64
updated_date float64
How can I convert these types to native Python types, such as from int64 (which is essentially numpy.int64) to a classic int that psycopg2 would then accept for the insert into Postgres?
data['id_code'].astype(np.int) still yields int64
It is nonetheless possible to convert from one NumPy type to another (e.g. from int to float):
data['id_code'].astype(float)
changes the dtype to
dtype: float64
The bottom line is that psycopg2 doesn't seem to understand NumPy datatypes; if anyone has ideas on how to convert them to classic types, that would be helpful.
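For reference, a quick way to do this one value at a time: every NumPy scalar has an .item() method that returns the equivalent native Python object, e.g.:

import numpy as np

x = np.int64(42)
print(type(x))         # <class 'numpy.int64'> -- the type psycopg2 can't adapt
print(type(x.item()))  # <class 'int'> -- native Python int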
Update: Insertion into the DB
import pandas as pd
import psycopg2

# fh, sheetname and db are assumed to be defined elsewhere in the module
def insert_many():
    """Add data to the table."""
    sql_query = """INSERT INTO classification(
                   id_code, sector, created_date, updated_date)
                   VALUES (%s, %s, %s, %s);"""
    data = pd.read_excel(fh, sheet_name=sheetname)
    # index=False keeps the dataframe index out of the records;
    # the values themselves still carry their NumPy dtypes
    data_list = list(data.to_records(index=False))
    conn = None
    try:
        conn = psycopg2.connect(db)
        cur = conn.cursor()
        cur.executemany(sql_query, data_list)
        conn.commit()
        cur.close()
    except (Exception, psycopg2.DatabaseError) as error:
        print(error)
    finally:
        if conn is not None:
            conn.close()
Solution
I'm not sure why your data_list contains NumPy data types, but the same thing happens to me when I run your code. Here is an alternative way to construct data_list so that integers and floats end up as their native Python types:
data_list = [list(row) for row in data.itertuples(index=False)]
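This works because iterating over a pandas Series hands back native Python scalars rather than NumPy ones, and itertuples iterates the columns that way. A quick sanity check (the column names below are made up for illustration):

import pandas as pd

df = pd.DataFrame({"id_code": [1, 2], "created_date": [1.5, 2.5]})
rows = [list(row) for row in df.itertuples(index=False)]
print(type(rows[0][0]))  # <class 'int'>, not numpy.int64
print(type(rows[0][1]))  # <class 'float'>, not numpy.float64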
Alternate approach
I think you could accomplish the same thing in fewer lines of code by using pandas to_sql:
import sqlalchemy
import pandas as pd

data = pd.read_excel(fh, sheet_name=sheetname)
engine = sqlalchemy.create_engine("postgresql://username@hostname/dbname")
# to_sql takes the table name first, then the connection/engine
data.to_sql('classification', engine, if_exists='append', index=False)
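With this approach, SQLAlchemy handles adapting the values, so the NumPy dtypes are mapped to Postgres column types for you. if_exists='append' adds the rows to the existing classification table rather than replacing it, and index=False keeps the dataframe index out of the insert. Note that the engine URL above is a placeholder for your own connection string, and a postgresql:// URL still requires the psycopg2 driver to be installed.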
Answered By - Michael