US court fully legalized website scraping and technically prohibited it
Coverage of the decision in the LinkedIn vs hiQ Labs case. The appeals court also upheld a lower court ruling that prohibits LinkedIn from interfering with hiQ’s web scraping of its site. This fundamentally changes the balance of power in such cases going forward.
The dark side of .io: How the U.K. is making web domain profits from a shady Cold War land deal
Came to this (older) piece via the recent decision by draw.io to rename to diagrams.net, as they are not happy with the state of the .io domain.
Super short article about stepping out from Ecto.Schema closer to raw SQL with Ecto. The neat trick is to use a query for which Ecto will return a (list of) Map instead of a list of lists.
Cheap tricks for high-performance Rust
Pascal shares some of the simple tricks to speed up your Rust programs without really changing the source. Hints like properly setting your target architecture, alternative allocator, release profiles and more.
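As an illustration of the kind of changes involved (the concrete values and the mimalloc crate are my assumptions, not taken from Pascal's post):

```toml
# Cargo.toml – a release profile tuned for speed; values are illustrative
[profile.release]
lto = "fat"        # whole-program link-time optimization
codegen-units = 1  # fewer codegen units = better optimization, slower builds
panic = "abort"    # drop unwinding machinery if you don't need it

# an alternative allocator is one dependency away
[dependencies]
mimalloc = "0.1"
```

Selecting the target architecture happens outside the manifest, e.g. RUSTFLAGS="-C target-cpu=native" cargo build --release, and the allocator is switched in code with #[global_allocator] static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;.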
When Bloom filters don’t bloom
Marek needed to deduplicate a large list of IP addresses, so he set sail on the journey of doing better than sort | uniq. He shares lessons learned about random memory access latency, the power of cache-friendly data structures and Bloom filters, and finally ends up with “just” a hash table.
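Both endpoints of that journey can be sketched in a few lines. This is an illustrative Python sketch of the two ideas, not Marek's actual implementation:

```python
import hashlib

def dedup_exact(ips):
    """Exact deduplication: a hash table trades memory for O(1) membership checks."""
    return list(dict.fromkeys(ips))  # preserves first-seen order

class BloomFilter:
    """Tiny Bloom filter: may report false positives, never false negatives."""
    def __init__(self, size_bits=1 << 20, hashes=4):
        self.size = size_bits
        self.hashes = hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        digest = hashlib.sha256(item.encode()).digest()
        for i in range(self.hashes):
            # carve independent-ish hash values out of one digest
            chunk = int.from_bytes(digest[i * 8:(i + 1) * 8], "big")
            yield chunk % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

if __name__ == "__main__":
    print(dedup_exact(["10.0.0.1", "10.0.0.2", "10.0.0.1"]))
    # ['10.0.0.1', '10.0.0.2']
```

The catch the article explores: the Bloom filter's bit array is hit at random positions, so for large inputs the cache misses can make it slower than the "dumber" hash table.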
Moving a method from struct impl to trait causes performance degradation
On a very similar note as the previous one – code alignment having a significant impact on performance.
Starter project for Flutter plugins willing to access native and synchronous rust code using FFI
Flutter meets Rust, wow.
I had never really given recursion a thought in the context of SQL…
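For anyone else who had not considered it: the standard mechanism is a recursive common table expression. A minimal sketch using SQLite via Python (the linked piece may use a different engine):

```python
import sqlite3

# A recursive CTE demonstrated with SQLite's WITH RECURSIVE.
# This generates the numbers 1..5 without any table.
conn = sqlite3.connect(":memory:")
rows = conn.execute("""
    WITH RECURSIVE counter(n) AS (
        SELECT 1                                -- anchor member
        UNION ALL
        SELECT n + 1 FROM counter WHERE n < 5   -- recursive member
    )
    SELECT n FROM counter
""").fetchall()
print([n for (n,) in rows])  # [1, 2, 3, 4, 5]
```

The same shape handles genuinely recursive data such as org charts or graph traversals: the anchor selects the roots, the recursive member joins children onto the rows produced so far.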
pgtune on the web. A simple site that gives you a PostgreSQL configuration to start with for different use-cases and server configurations.
List of various papers related to BERT (Bidirectional Encoder Representations from Transformers). BERT was released by Google as part of their NLP research, but the field keeps moving forward and you can find multi-modal applications as well.
Exploring Transfer Learning with T5: the Text-To-Text Transfer Transformer
New publication by Google in the NLP space. They have published a new model called the Text-To-Text Transfer Transformer (T5) and have also open-sourced a new pre-training dataset, the Colossal Clean Crawled Corpus (C4).
Transformers are Graph Neural Networks
Drawing parallels between Transformers (key component of BERT) and Graph Networks.
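The core of the parallel: self-attention lets every token aggregate information from every other token, i.e. message passing on a fully connected graph with learned edge weights. A minimal single-head sketch in NumPy (shapes and initialization are my own, purely illustrative):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Each token ('node') receives a weighted sum of messages from all tokens;
    the attention matrix plays the role of a soft adjacency matrix."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    weights = softmax(Q @ K.T / np.sqrt(K.shape[-1]))  # rows sum to 1
    return weights @ V  # weighted aggregation over all "neighbours"

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))  # 4 tokens = 4 graph nodes, 8-dim features
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

A graph neural network layer on an arbitrary graph looks the same, except the aggregation runs only over each node's actual neighbours instead of all tokens.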