Tag: performance

Ramblings on optimizations, anti-patterns and N+1

A lot of people ask me to teach them how to do query analysis and performance tuning. The truth is: there isn’t a script to follow. The following paragraphs are a brain dump of what usually goes on in my mind when I am debugging and analyzing.

Please comment on what you think I should focus on and cover here.

TL;DR

  • It’s just a messy post with database-y stuff
  • This post doesn’t have a conclusion; it is just me laying down my thoughts on performance and optimizations.

Thoughts

Query performance is a really difficult subject to talk about, mostly because SQL is a declarative language: it is up to the optimizer to decide which way is the best to retrieve the information needed, and that decision is based on many variables.

The most common optimization problem I see comes not from the database itself, but from how we handle the requests on the application layer. The following, for instance, would cause an N+1 problem:

Code example:


users = User.all
users.each do |user|
  puts "Name #{user.name}"
  puts "Addresses: "
  user.addresses.each do |address|
    puts address.street
    puts "#{address.city}#{address.state}"
  end
end

Although it seems innocent at first, this code could easily hurt database performance due to the number of queries it makes: one to load the users, plus one per user to load their addresses.
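
A minimal sketch of how this could be avoided, assuming the same User and Address models as above, using ActiveRecord’s includes to eager load the association:

# Eager load the association: ActiveRecord issues one query for the users and a
# second one for all of their addresses, instead of one extra query per user.
users = User.includes(:addresses)
users.each do |user|
  puts "Name #{user.name}"
  # user.addresses is already in memory here, so no query is fired inside the loop
  user.addresses.each { |address| puts address.street }
end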

You also need to know about the intricacies of indexes: which one is the best, if you have a composite index, which column should go first, and what happens if you only use one of the fields of a two-column index in your search? Does it still use the index somehow? Another rule of thumb is that if an index is a BTREE on a single column, you can use it either ASC or DESC.
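
To make the composite-index question concrete, here is a sketch of a migration adding a two-column BTREE index (the columns come from the addresses table in this post, but the index choice itself is just an example). As a rule of thumb, only queries that filter on the leftmost column(s) of the index can take full advantage of it:

class AddUserIdAndCityIndexToAddresses < ActiveRecord::Migration[5.1]
  def change
    # Composite BTREE index on (user_id, city). A search on user_id alone can
    # still use it (leftmost prefix); a search on city alone usually cannot
    # take full advantage of it.
    add_index :addresses, [:user_id, :city]
  end
end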

Or better yet: why are my transactions taking so long to complete? Does the table have too many indexes? Is any other query locking table X?

Even a single INNER JOIN could be highly costly if it joins two large tables.
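
For reference, this is roughly what a single INNER JOIN looks like through ActiveRecord, using the users and addresses tables from this post (the city value is just a placeholder); with two large tables you would at least want the join column indexed:

# Generates something like:
#   SELECT users.* FROM users
#   INNER JOIN addresses ON addresses.user_id = users.id
#   WHERE addresses.city = 'Some City'
users_in_city = User.joins(:addresses).where(addresses: { city: "Some City" })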

Why are you saving that JSON in a TEXT field? And while we are on the subject, do you really need the JSON in the relational database, and not in a document store?
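
If the JSON really does belong next to your relational data, one alternative to stuffing it into TEXT is a native JSON column, which Rails supports on PostgreSQL (jsonb) and on MySQL 5.7+ (json); the column name below is just an example:

class AddPreferencesToUsers < ActiveRecord::Migration[5.1]
  def change
    # PostgreSQL: jsonb stores parsed, indexable JSON instead of an opaque string.
    # On MySQL 5.7+ you would use :json instead of :jsonb.
    add_column :users, :preferences, :jsonb, default: {}
  end
end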

You don’t need to port all your data from PostgreSQL/MySQL to MongoDB if you want to have MongoDB on your stack. Everything has its place: relational data in relational databases and non-relational data in non-relational databases. I even find benchmarks between a SQL database and a NoSQL one unfair; they were made to solve different problems, and you can’t possibly have the same use case for both of them.

No, it’s not ok to have category_1, category_2, ..., category_n as columns on your products table.
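
What that usually wants to be is a normalized many-to-many relationship. A sketch (the table and column names are illustrative, and a products table is assumed to exist):

class CreateCategories < ActiveRecord::Migration[5.1]
  def change
    create_table :categories do |t|
      t.string :name, null: false
      t.timestamps
    end

    # Join table: one row per (product, category) pair, no artificial column limit.
    create_table :product_categories do |t|
      t.integer :product_id, null: false
      t.integer :category_id, null: false
      t.timestamps
    end

    add_foreign_key :product_categories, :products
    add_foreign_key :product_categories, :categories
    add_index :product_categories, [:product_id, :category_id], unique: true
  end
end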

Avoid nullable fields as much as possible.
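
In a Rails migration that means saying NOT NULL explicitly. A small sketch (the email column is hypothetical):

class TightenNullConstraints < ActiveRecord::Migration[5.1]
  def change
    # New columns: declare NOT NULL (and a sensible default) from the start.
    # The email column here is only an example.
    add_column :users, :email, :string, null: false, default: ""

    # Existing columns: tighten them once the data has been backfilled.
    change_column_null :addresses, :zipcode, false
  end
end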

Relationships should also explicitly live in the RDBMS, not only in your models. If you have a user_id on your addresses table, tell the database so; naming the column user_id doesn’t automatically create the foreign key.

You need:


ALTER TABLE `addresses` ADD CONSTRAINT `fk_user_id` FOREIGN KEY (`user_id`) REFERENCES `users` (`id`);

Or your migration should look something like this:


class CreateUsers < ActiveRecord::Migration[5.1]
  def change
    create_table :users do |t|
      t.string :name
      t.timestamps
    end
  end
end

class CreateAddresses < ActiveRecord::Migration[5.1]
  def change
    create_table :addresses do |t|
      t.text :street
      t.string :city
      t.string :state
      t.string :zipcode
      t.integer :user_id
      t.timestamps
    end
    add_foreign_key :addresses, :users
  end
end

The add_foreign_key :addresses, :users line adds to the addresses table a foreign key referencing users.
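
For what it’s worth, Rails migrations also let you declare the column, its index, and the foreign key in one line with t.references; this is just an equivalent sketch of the addresses table above:

create_table :addresses do |t|
  t.text :street
  t.string :city
  t.string :state
  t.string :zipcode
  # Creates the user_id column, an index on it, and the foreign key constraint.
  t.references :user, foreign_key: true
  t.timestamps
end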

End

And you, what do you think is missing from this blog post? What do you want me to go deeper on?