PHP Group By Array Key
I search for this more than I thought I would. As the title suggests, the following allows you to group the items of an array by a given key. By Tim Cooper
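The snippet itself isn't shown in this excerpt, but the shape of it is roughly this (the function and field names here are illustrative, not the original's):

```php
<?php
function group_by(array $items, string $key): array
{
    $grouped = [];
    foreach ($items as $item) {
        // Rows sharing the same value for $key end up in one bucket
        $grouped[$item[$key]][] = $item;
    }
    return $grouped;
}

$users = [
    ['name' => 'Ann', 'role' => 'admin'],
    ['name' => 'Bob', 'role' => 'editor'],
    ['name' => 'Cid', 'role' => 'admin'],
];

// Groups into ['admin' => [...2 rows...], 'editor' => [...1 row...]]
$byRole = group_by($users, 'role');
```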
Redis has a maxmemory-policy setting that decides how to handle keys when the memory limit is reached. noeviction will return errors when the memory limit is reached. allkeys-lru will remove the least recently used keys first. volatile-lru will remove the least recently used keys first, among those that have an expiry set. allkeys-random will evict random keys. volatile-random will evict … Read more
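These policies live in redis.conf (or can be set at runtime with CONFIG SET). A minimal fragment, with the 256mb cap as an arbitrary example:

```
# redis.conf — cap memory and evict via approximated LRU
# across all keys once the cap is reached
maxmemory 256mb
maxmemory-policy allkeys-lru
```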
If a user makes two requests in succession and you want to cancel the first, here’s how to do it in Axios. Works like a charm. Provided by NicksonYap, refined over a few iterations in the thread. https://github.com/axios/axios/issues/1361#issuecomment-417880626
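A dependency-free sketch of the "latest request wins" idea using AbortController (axios accepts a `signal` option since v0.22; the linked comment uses the older CancelToken API, which works the same way in spirit). `latestOnly` and the endpoint below are illustrative names, not from the original:

```javascript
// Wrap any signal-aware, promise-returning function so each new call
// aborts the previous in-flight one.
function latestOnly(requestFn) {
  let controller = null;
  return (...args) => {
    if (controller) controller.abort(); // cancel the previous request
    controller = new AbortController();
    return requestFn(controller.signal, ...args);
  };
}

// Usage sketch with axios (endpoint is an assumption):
// const search = latestOnly((signal, q) =>
//   axios.get('/api/search', { params: { q }, signal }));
```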
I was having trouble understanding the work-flow and boilerplate of using Vuex state in my components. This article by Markus Oberlehner helped me understand how to bind elements to the state and how to use vuex-map-fields to simplify the process. Here’s the manual way. Read his post to find out about vuex-map-fields https://markus.oberlehner.net/blog/form-fields-two-way-data-binding-and-vuex/
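A dependency-free sketch of the manual pattern from the article: a computed property whose getter reads Vuex state and whose setter commits a mutation. The `store` object below stands in for a real Vuex store, and the state/mutation names are assumptions:

```javascript
// Stand-in for a Vuex store (state + commit), just to show the wiring
const store = {
  state: { form: { name: '' } },
  commit(type, payload) {
    if (type === 'updateName') this.state.form.name = payload;
  },
};

// What you'd declare under `computed` in the component
const computed = {
  name: {
    get() { return store.state.form.name; },
    set(value) { store.commit('updateName', value); },
  },
};

// In the template, <input v-model="name"> then reads and writes
// through the store instead of local component state.
```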
If you are using Laravel’s firstOrCreate() method and get a unique id violation error, there is a chance you repopulated your Postgres database from an import. Postgres only advances the id sequence on row creation, not on import, so the sequence’s next value can collide with an id that already exists … Read more
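One common fix is to re-sync the sequence with the table’s current max id; `my_table` and `id` below are placeholders for your own table and primary-key column:

```sql
-- Point the id sequence at the current max id so the next
-- INSERT doesn't collide with imported rows
SELECT setval(
  pg_get_serial_sequence('my_table', 'id'),
  COALESCE(MAX(id), 1)
)
FROM my_table;
```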
You got an error. A composer memory limit error. You go to the documentation. It tells you to search through your PHP ini files and increase the memory limit. You realize that’s going to take a while depending on what env you’re using. You’d rather just copy and paste a command in the terminal. This … Read more
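The copy-and-paste version is Composer’s own environment variable, which lifts the limit for a single invocation without touching any php.ini (`-1` means unlimited; `install` is just an example subcommand):

```shell
# Lift the memory limit for this one composer run only
COMPOSER_MEMORY_LIMIT=-1 composer install

# Equivalent: raise PHP's limit just for this invocation
php -d memory_limit=-1 "$(command -v composer)" install
```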
This query is to delete duplicate rows on multiple selected columns for Postgres. Just plug in the table_name and columns you want to evaluate duplicates on. This query is the fastest out of several I’ve tried, evaluating and deleting 5 million duplicate rows on 8 columns in about 20 seconds.
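The post’s exact query isn’t reproduced in this excerpt, but a common fast shape for this kind of cleanup is a self-join on ctid; `table_name` and the column names are placeholders for your own:

```sql
-- Rows are duplicates when all listed columns match; the copy with
-- the lower ctid is deleted, keeping one row per group
DELETE FROM table_name a
USING table_name b
WHERE a.ctid < b.ctid
  AND a.col1 = b.col1
  AND a.col2 = b.col2
  AND a.col3 = b.col3;
```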
This is a baller gist of common useful Postgres queries.
You know you have to query last month’s records all the time (instead of the last 30 days). Here’s how to get those numbers easily with Carbon. We do some extra footwork for the end of the month so we don’t naively reuse the current month’s last date, which isn’t always the previous month’s last date.
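A sketch of the idea with Carbon (assumes nesbot/carbon is installed; anchoring on startOfMonth before subtracting sidesteps the end-of-month overflow problem):

```php
<?php
use Carbon\Carbon;

// First moment of last month: anchor on the 1st, then step back a month
$start = Carbon::now()->startOfMonth()->subMonth();

// Last moment of last month: one second before this month began
$end = Carbon::now()->startOfMonth()->subSecond();

// e.g. in a Laravel query (model/column are illustrative):
// Order::whereBetween('created_at', [$start, $end])->get();
```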
If your query is forcing the db to search through millions of rows to fetch data, indexing a table can help improve performance. Here’s how to do it on Postgres. ELI5 Index If you’re actually five you probably haven’t seen one of those card-catalog indexes in wooden boxes at public libraries, but it … Read more
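Creating the index itself is one line; `orders` and `customer_id` below are placeholder names, and the optional CONCURRENTLY lets Postgres build the index without blocking writes to the table:

```sql
-- Speeds up queries that filter or join on customer_id
CREATE INDEX CONCURRENTLY idx_orders_customer_id
  ON orders (customer_id);
```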