Lucky
I mean the IdentityMap could be a WeakRef, so if an entity isn't used anywhere else, *poof*, it's gone
Maybe for the current session that should not happen, but for old sessions, if the object isn't stored in any other variable, it will never be used again
Lucky
Therefore it could get garbage collected
Carel
You could call it “foundit”, like when you lose your car keys?
Lucky
Lol
Alexander
We need to invalidate objects, because they may be updated inside the database by other concurrent processes. When working with a database we use transactions. Each transaction has an independent "world view" and works with the database state as if nobody else changes the database at the same time. After a transaction is finished, we cannot assume that the objects are in the same state, because they may have been updated by concurrent transactions. There is no easy way to listen to the database and invalidate all concurrently updated objects in memory.

We could have a global long-lived cache which eliminates the need for db_session; that is basically what interactive mode does. But in this case you will discover one day that you are working with obsolete data. Also, without db_session your global cache will grow indefinitely and will not be garbage collected.

db_session implements the standard architecture patterns Unit of Work https://martinfowler.com/eaaCatalog/unitOfWork.html and Identity Map https://martinfowler.com/eaaCatalog/identityMap.html and provides a convenient way of working with the session cache implicitly, without the need to call session_cache.attach(obj)
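A minimal sketch of the Identity Map pattern described above, in plain Python. This is an illustration only, not Pony's actual implementation; the Session class and loader parameter are made-up names:

```python
# Identity Map sketch: a session cache guarantees that the same
# database row always maps to a single in-memory object.

class Session:
    def __init__(self):
        self.identity_map = {}  # (entity_class, primary_key) -> object

    def get(self, entity_class, pk, loader):
        key = (entity_class, pk)
        if key not in self.identity_map:
            # first access: "load" the row and attach it to the cache
            self.identity_map[key] = loader(pk)
        return self.identity_map[key]

class Student:
    def __init__(self, pk):
        self.pk = pk

session = Session()
s1 = session.get(Student, 123, Student)
s2 = session.get(Student, 123, Student)
print(s1 is s2)  # True: one row, one object, within the session
```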
Alexander
Unfortunately the IdentityMap cannot hold weakrefs, because objects in the IdentityMap are interlinked. Consider the following example:

with db_session:
    s1 = Student[123]
    for s in s1.group.students:
        print(s.name)

Here group.students holds references to all students of the group, and each student.group refers back to the group object. As we hold a reference to one student, we indirectly hold all the other objects too, via circular references. If we used weakrefs for circular references, the group and all students besides s1 would be garbage collected immediately. So we need to use regular references in the IdentityMap instead of weakrefs
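The point about holding one object indirectly holding the whole graph can be demonstrated with plain Python objects. A sketch assuming regular (strong) references in both directions; Group and Student here are stand-ins, not Pony entities:

```python
import gc
import weakref

class Group:
    def __init__(self):
        self.students = []       # strong references to students

class Student:
    def __init__(self, group):
        self.group = group       # strong reference back to the group
        group.students.append(self)

g = Group()
s1 = Student(g)
other = Student(g)

# Keep only s1; drop every other outside reference.
g_probe = weakref.ref(g)
other_probe = weakref.ref(other)
del g, other
gc.collect()

# s1.group.students still reaches everything, so nothing is collected:
print(g_probe() is not None)       # True
print(other_probe() is not None)   # True
print(len(s1.group.students))      # 2
```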
Alexander
> but for old sessions, if the object isn't stored in any other variable, it won't be used ever again

It is not correct, as the object can be accessible via relationships: s1.group.students
Lucky
hmmm
Lucky
It would help replying to you if you'd hit enter after paragraphs so I can reply to them individually.
Alexander
Ok
Lucky
maybe you can just copy paste the text, if you want to write it first
Lucky
> As we hold a reference to one student, we indirectly hold all other objects too, via circular references. If we use weakref for circular references, the group and all students beside s1 will be garbage collected immediately. So we need to use regular references in IdentityMap instead of weakrefs

Can we make just the reverse elements weak? Like

| Student | Group |
|---------|-------|
| Günter  | 12    |
| Herbert | 12    |
| Amadeus | 43    |

So we could have Student.group normal, but Group.students weak?
Lucky
That would be the way we would need to load them from the database anyway, I imagine
Lucky
If you load Group, it wouldn't load all .students
Lucky
but with a Student, the Group would be non-weak, and wouldn't get deleted.
Lucky
Also, for the current session, all entities would be stored in a non-weak Identity Map; only when a new session is started would they get transferred into a weak global version
Alexander
> Can we make just the reverse elements weak?
> So we could have Student.group normal, but Group.students weak?

If a user does:

g = Group[123]
for s in g.students:
    print(s.name)

they want to be able to traverse the relationship from group to student, and to do this group.students should contain a set of students which will not be garbage-collected immediately
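What would happen with the proposal (strong Student.group, weak Group.students) can be tried in plain Python with weakref; these classes are stand-ins, not Pony entities:

```python
import weakref

class Group:
    def __init__(self):
        self._refs = []          # weak references to students
    def add(self, student):
        self._refs.append(weakref.ref(student))
    @property
    def students(self):
        # dereference, dropping students that were garbage-collected
        return [r() for r in self._refs if r() is not None]

class Student:
    def __init__(self, name, group):
        self.name = name
        self.group = group       # strong reference, as proposed
        group.add(self)

g = Group()
s1 = Student('Günter', g)
Student('Herbert', g)            # no outside reference kept

# Herbert vanished as soon as the statement finished; only s1 survives:
print([s.name for s in g.students])  # ['Günter']
```

So a weakly-referencing `group.students` loses exactly the students you would want to traverse.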
Alexander
> If you load Group, it wouldn't load all .students

But if you access group.students, you do
Lucky
Hmmm. Damnit
Lucky
How are those garbage collected anyway?
Alexander
After db_session is over, it disconnects objects from itself, but they remain interlinked with each other via references. If there is a reference to at least one object, it holds all interlinked objects. After all references from the outside to this group of objects are removed, Python garbage-collects them. Python can detect cross-referenced objects which are not referenced from the outside
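Python's cycle detector can be observed directly. A small sketch, assuming two plain objects that reference each other, like entities left interlinked after a db_session ends:

```python
import gc
import weakref

class Node:
    pass

# Two objects referencing each other, as entities remain interlinked
# after the session ends.
a, b = Node(), Node()
a.other, b.other = b, a

probe = weakref.ref(a)
del a, b              # no outside references remain, only the cycle
gc.collect()          # the cycle detector reclaims both objects
print(probe() is None)  # True
```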
Alexander
For db_session(strict=True) all relationships between objects are removed, so Python can garbage-collect objects more efficiently
Lucky
I'm still not entirely sure we can't use that process and just shove them into an additional weak list to recycle the existing objects?
Alexander
The main point of db_session is to denote the moment when the objects can be garbage-collected, when no outside references to them exist
Alexander
Especially in multi-threaded applications
Lucky
No it doesn't
Lucky
that is true
Anonymous
Hello https://editor.ponyorm.com/user/<username> does not fully load for me, does it work for you?
Anonymous
Failed to load resource: net::ERR_CONNECTION_REFUSED jquery-3.1.0.min.js:1
Alexander
It works for me, maybe some internet connectivity problem
Anonymous
everything else on the internet works, but thanks
Pav
Hi! How do I install Pony 0.7.7?
stsouko
Pip install git+https://url@orm#pony
stsouko
Url is git clone url
Anzor
Hi guys and also those rare girls! @metaprogrammer not sure who's responsible for that, but I just switched to the Basic plan for only $5 for this month. I mean on the editor.
Alexander
Hi Anzor! Do I understand correctly that you've switched from Free to Basic plan on editor.ponyorm.com and everything is ok?
stsouko
https://github.com/stsouko/LazyPony
Permalink Bot
https://github.com/stsouko/LazyPony
Permanent link to the stsouko/LazyPony project you mentioned. (?)
Alexander
Wow nice picture
stsouko
this is my old avatar. I don't know why it hasn't changed
Александр
Hey. Can you give links to big projects where pony is used? I want to study the code.
Alexander
https://github.com/Tribler/tribler
Permalink Bot
https://github.com/Tribler/tribler
Permanent link to the Tribler/tribler project you mentioned. (?)
Александр
thanks!! I will go study it)
stsouko
Hello! How do I set the old-style sql_debug to see all queries in every thread/function/method?
Alexander
set_sql_debug(True)
Alexander
old sql_debug(True) works too, but will be deprecated
Alexander
Oh, you mean, in all threads
stsouko
I wrote mixins for Entity. In these mixins I do many queries, but I can see only the top-level queries
stsouko
I think generators ignore sql_debug
Alexander
I'll check that today. Right now, as a quick fix, you can do

import pony.orm.core
pony.orm.core.local.debug = True

before queries
stsouko
Thank You!
stsouko
not working for me
stsouko
or before all queries?
Alexander
To be sure, you can put it before each query
Alexander
Are you sure queries are actually executed?
stsouko
> To be sure, you can put it before each query

this also doesn't work. Yes, I see the results of execution, in debug mode line by line also
Alexander
Maybe you use the logging module and it is misconfigured, so it suppresses output
stsouko
pure python main.py also doesn't work. Only the first query is shown
Alexander
That looks strange
Alexander
Do you have import logging in main.py or in some module which imports from main.py?
stsouko
only this: from logging import warning
Alexander
can you remove it?
Alexander
Actually, it doesn't matter
Alexander
> in debug mode line by line also

If you use a debugger, you can put a breakpoint inside the pony.orm.core.log_sql function (it's at core.py:112) and see if it is executed
stsouko
yes. it executed
stsouko
for every
stsouko
but not printed
stsouko
the condition if has_handlers(sql_logger): is executed
stsouko
level 20 is disabled
Alexander
Ok, so the logging module is configured to silence INFO-level messages
Alexander
You need to change it
Alexander
Or you can direct pony to use another level for its messages
Alexander
import pony.orm.core
pony.orm.core.orm_log_level = logging.WARNING

But it is probably better to set the logging level to INFO