Is there a better way to search for users and computers specifically using the Net-ldap gem?
Here is what I am currently having to do to get only users.
results = search :base => @base, :filter => Net::LDAP::Filter.eq("cn", "*")
@results = Array.new
results.each do |result|
  @results.push result if result[:objectclass].include?("person") && !result[:objectclass].include?("computer")
end
Seems like there would be a better way. I can't see anything obvious in the documentation.
You can use the Join filter functionality of net-ldap:
filter = Net::LDAP::Filter.eq("sAMAccountName", "*")
filter2 = Net::LDAP::Filter.eq("objectCategory", "organizationalPerson")
joined_filter = Net::LDAP::Filter.join(filter, filter2)
ldap.search(:base => treebase, :filter => joined_filter) do |entry|
  puts entry.sAMAccountName
end
If you know the objectClass that is used for persons, you could use the filter '(objectClass=person)', replacing 'person' with the objectClass. Most implementations will use 'person' or an objectClass that inherits from 'person' such as 'inetOrgPerson'. Using the filter '(cn=*)' will most likely get entries that are not persons.
Try using Filter.eq("objectClass","person")
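For example, with net-ldap you can combine an objectClass filter with a negated one so the exclusion happens on the server side. A minimal sketch, assuming an already-bound ldap connection and a treebase variable:
person   = Net::LDAP::Filter.eq("objectClass", "person")
computer = Net::LDAP::Filter.eq("objectClass", "computer")

# (&(objectClass=person)(!(objectClass=computer)))
users_only = person & ~computer

ldap.search(:base => treebase, :filter => users_only) do |entry|
  puts entry.dn
end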
Let's say I have an index with multiple objects in it:
class ThingsIndex < Chewy::Index
define_type User do
field :full_name
end
define_type Post do
field :title
end
end
How do I search both users' full_name and posts' title?
The docs only talk about querying one attribute like this:
ThingsIndex.query(term: {full_name: 'Foo'})
There are a couple ways you could do this. Chaining is probably the easiest:
ThingsIndex.query(term: {full_name: 'Foo'}).query(term: {title: 'Foo'})
If you need to do several queries, you might consider merging them:
query = ThingsIndex.query(term: {full_name: 'Foo'})
query = query.merge(ThingsIndex.query(term: {title: 'Foo'}))
Read more about merging here: Chewy #merge docs
Make sure to set your limit or else it only shows 10 results:
query.limit(50)
I am new to Ruby and ThinkingSphinx.
I have the following Sphinx query: SELECT * FROM user_core, user_delta WHERE sphinx_deleted = 0.
I do not want the sphinx_deleted = 0 condition. How do I remove it? I have removed sql_attr_uint = sphinx_deleted from my sphinx.conf file, yet I still see sphinx_deleted being passed in the query.
Here is the index file definition:
ThinkingSphinx::Index.define :user, :with => :active_record, :delta => true do
  indexes [first_name, last_name, display_name], :as => :name, :sortable => true
  indexes first_name, :sortable => true
  indexes last_name, :sortable => true
  indexes display_name, :sortable => true
  indexes email, :sortable => true
  indexes phone, :sortable => true
  indexes title, :sortable => true

  has id, :as => :user_id
  has roles(:id), :as => :role_ids
  has jurisdictions(:id), :as => :jurisdiction_ids

  set_property :delta => true
end
I do not have a sphinx_scope or default_sphinx_scope defined.
We are using thinking-sphinx-3.1.0 and ruby-2.1.0
The sphinx_deleted attribute is created by Thinking Sphinx, and is used in the following cases (using your scenario of a User model with core and delta indices in the examples):
When a User is deleted, sphinx_deleted is set to 1 for that record in both the core and delta indices - there's no point returning Sphinx records if the underlying ActiveRecord object no longer exists.
When a User is updated, the delta index is processed with the latest field and attribute details, and the core index's document has sphinx_deleted set to 1, so only the latest (accurate) information will match. e.g. if a user has their name changed from Fred to Georgina, a search for 'Fred' will not return Georgina, because the core index document (which does match) is filtered out.
That is why the attribute exists. You cannot tell Thinking Sphinx to not add it, nor can you remove that filter, short of mucking around in the internals of Thinking Sphinx.
If there is a specific reason for wanting to remove the attribute and filter, feel free to comment here, or you can open an issue on the GitHub repo, or post to the TS Google Group.
Update
Okay, further to this, there are three ways around it.
Option One:
The first way is to make the query to Sphinx yourself, using a Thinking Sphinx connection:
results = ThinkingSphinx::Connection.take do |connection|
  connection.execute "SELECT * FROM user_core, user_delta"
end
Keep in mind that this returns raw Sphinx values, not ActiveRecord instances.
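If you do need ActiveRecord objects back from that raw query, one option (my own sketch, relying on the sphinx_internal_id attribute that Thinking Sphinx v3 adds to its indices; the exact row format depends on your connection adapter) is to map the ids back yourself:
rows = ThinkingSphinx::Connection.take do |connection|
  connection.execute("SELECT * FROM user_core, user_delta").to_a
end

# sphinx_internal_id holds the model's primary key in TS v3 indices
users = User.where(:id => rows.map { |row| row['sphinx_internal_id'] })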
Option Two:
A more complicated alternative, though, is to have your own search middleware stack. First, you'll want to create a custom subclass of ThinkingSphinx::Middlewares::SphinxQL that removes the :sphinx_deleted filter:
class SphinxQLWithoutFilter < ThinkingSphinx::Middlewares::SphinxQL
  def call(contexts)
    contexts.each do |context|
      Inner.new(context).call
    end

    app.call contexts
  end

  private

  class Inner < ThinkingSphinx::Middlewares::SphinxQL::Inner
    def inclusive_filters
      super.except :sphinx_deleted
    end
  end
end
Then, create a new middleware stack which uses this new SphinxQL query middleware:
WithoutFilterMiddleware = ::Middleware::Builder.new do
  use ThinkingSphinx::Middlewares::StaleIdFilter
  use SphinxQLWithoutFilter
  use ThinkingSphinx::Middlewares::Geographer
  use ThinkingSphinx::Middlewares::Inquirer
  use ThinkingSphinx::Middlewares::ActiveRecordTranslator
  use ThinkingSphinx::Middlewares::StaleIdChecker
  use ThinkingSphinx::Middlewares::Glazier
end
And then you can use that middleware stack in specific search queries:
User.search 'foo', :middleware => WithoutFilterMiddleware
It's worth noting the two middleware present in that stack for stale ids. They work together to catch any Sphinx results that do not have a matching ActiveRecord object, and re-run the Sphinx query up to three times filtering out those unmatched records. They're probably useful, but if you don't want to use them, you can remove them from your custom stack. However, without them, any Sphinx records that don't have matching ActiveRecord objects will be transformed into nils.
Option Three:
This is the more hackish version of the previous solution, but will apply to all searches, so probably isn't worthwhile: re-open the class that adds the filter with class_eval and change the method definition:
ThinkingSphinx::Middlewares::SphinxQL::Inner.class_eval do
  def inclusive_filters
    # normally:
    # (options[:with] || {}).merge({:sphinx_deleted => false})
    # but without the sphinx_deleted filter:
    options[:with] || {}
  end
end
Now, all that said: I presume you're not actually deleting users, but somehow the deletion callbacks are being fired anyway? Hence, users do exist but are currently being filtered out by Sphinx? If so, I highly recommend not using ActiveRecord's destroy method, and instead having a custom method to mark users as inactive. This avoids the callbacks, and thus avoids the need for any of the above 'solutions'.
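A minimal sketch of that suggestion, assuming an active boolean column on users (the column, method name, and indexed attribute here are illustrative, not from the original answer):
class User < ActiveRecord::Base
  # Mark the user as inactive instead of destroying the record,
  # so no destroy callbacks (and no sphinx_deleted handling) fire.
  def deactivate!
    update_attribute :active, false
  end
end
You would then exclude inactive users at search time, for example by adding has active to the index and searching with User.search 'foo', :with => {:active => true}.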
I am using Sequel with Postgres and Sinatra. I want to do an autocomplete search. I've verified that my jQuery, which sends a GET request, works fine.
The Ruby code is:
get '/search' do
  search = params[:search]
  DB[:candidates].select(:last).where('last LIKE ?', '_a_').each do |row|
    l = row[:last]
  end
end
The problem is the Sequel query:
I have tried every possible configuration of the query that I can think of with no luck.
So, for example, in the above query I get all the people who have "a" in their last name but when I change the query to:
DB[:candidates].select(:last).where('last LIKE ?', 'search')
or
DB[:candidates].select(:last).where('last LIKE ?', search) # (without '')
I get nothing.
I have done warn params.inspect which indicates the param search is being passed, so I am stuck.
Any ideas how the query should be written?
Finally, the second part of the question: the results (when it works with '_a_') are rendered as {:last=>"Yao"}. I would like just Yao; how can I do that?
I have tried numerous different types of query, including raw SQL, but no luck. Or is the approach just plain wrong?
I just installed Sequel and made a working example:
require "rubygems"
require "sequel"
# connect to an in-memory database
DB = Sequel.sqlite
# create an items table
DB.create_table :items do
  primary_key :id
  String :name
  Float :price
end
# create a dataset from the items table
items = DB[:items]
# populate the table
items.insert(:name => 'abc', :price => rand * 100)
items.insert(:name => 'def', :price => rand * 100)
items.insert(:name => 'ghi', :price => rand * 100)
items.insert(:name => 'gui', :price => rand * 100)
# print out the number of records
puts "Item count: #{items.count}"
# print out the average price
puts "The average price is: #{items.avg(:price)}"
recs = items.select(:name).where(Sequel.like(:name, 'g%'))
recs.each do |rec|
puts rec.values
end
I think you will get the point.
UPDATED
So in your case you should try this:
DB[:candidates]
  .select(:last)
  .where(Sequel.like(:last, "#{search}%"))
  .map { |rec| rec.values }.flatten
It should return an array of the matching strings.
Copy/pasting from the Sequel documentation:
You can search SQL strings in a case sensitive manner using the Sequel.like method:
items.where(Sequel.like(:name, 'Acme%')).sql
#=> "SELECT * FROM items WHERE (name LIKE 'Acme%')"
You can search SQL strings in a case insensitive manner using the Sequel.ilike method:
items.where(Sequel.ilike(:name, 'Acme%')).sql
#=> "SELECT * FROM items WHERE (name ILIKE 'Acme%')"
You can specify a Regexp as a like argument, but this will probably only work on PostgreSQL and MySQL:
items.where(Sequel.like(:name, /Acme.*/)).sql
#=> "SELECT * FROM items WHERE (name ~ 'Acme.*')"
Like can also take more than one argument:
items.where(Sequel.like(:name, 'Acme%', /Beta.*/)).sql
#=> "SELECT * FROM items WHERE ((name LIKE 'Acme%') OR (name ~ 'Beta.*'))"
Open up a Sequel console (not your Sinatra app) and play with the query until you get results back. Since you say you want only the last column, your query should be something like:
# Search anywhere inside the last name
DB[:candidates].where( Sequel.ilike(:last, "%#{search}%") ).select_map(:last)
# Find last names starting with the search string
DB[:candidates].where( Sequel.ilike(:last, "#{search}%") ).select_map(:last)
Uglier alternatives:
DB[:candidates]
  .select(:last)
  .where( Sequel.ilike(:last, "%#{search}%") )
  .all
  .map{ |hash| hash[:last] }

DB[:candidates]
  .select(:last)
  .where( Sequel.ilike(:last, "%#{search}%") )
  .map( :last )
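To tie this back to the Sinatra route and the jQuery autocomplete, something along these lines may be all you need. A sketch, assuming the client expects a JSON array of names and that the json library is available:
require 'json'

get '/search' do
  search = params[:search].to_s

  # Last names beginning with the typed text, as a plain array of strings
  names = DB[:candidates]
            .where(Sequel.ilike(:last, "#{search}%"))
            .select_map(:last)

  content_type :json
  names.to_json  # e.g. ["Yao"]
end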
If you want to rank the search results by the best matches, you might be interested in my free LiqrrdMetal library. Instead of searching on the DB, you would pull a full list of all last names into Ruby and use LiqrrdMetal to search through them. This would allow a search string of "pho" to match both "Phong" as well as "Phrogz", with the former scoring higher in the rankings.
I think this is a simple problem, but I just can't get my head round the solution. If I have a collection of records (reports in this case):
@reports = Report.all(:conditions => ["score > 10"])
and then try to find the associated collection of another type of record (users in this case), I naively try this, but know from the outset that it won't work:
@users = User.find :all, :conditions => ["id IN (?)", @reports.user_id]
So, how do I efficiently extract the @users collection of records?
Assuming that User has_many :reports:
@users = User.joins(:reports) # all users that have reports
If you only want the users for some specific reports:
@users = User.joins(:reports).where("reports.id IN (?)", @reports.map(&:id))
What is a good method in Ruby to prevent SQL Injection?
In straight-up Ruby? Use prepared statements:
require 'mysql'
db = Mysql.new('localhost', 'user', 'password', 'database')
statement = db.prepare "SELECT * FROM table WHERE field = ?"
statement.execute 'value'
statement.fetch
statement.close
Not just in Ruby - bind your parameters (be it in the database, or in your client code).
Check out the guide they have up on this: http://guides.rubyonrails.org/security.html#injection
Basically, you want to use bind variables in your models to find data, rather than inline parameters.
Model.find(:first, :conditions => ["login = ? AND password = ?", entered_user_name, entered_password])
According to http://ruby.railstutorial.org/ you can prevent Cross Site Request Forgery by inserting the
<%= csrf_meta_tags %>
tag in the header of app/views/layouts/application.html.erb.
Direct link of example
This thread references:
http://www.ruby-forum.com/topic/90258#new
http://www.ruby-forum.com/topic/82349#143790
ActiveRecord's find() method has built-in ways to avoid SQL injection by using the format
:conditions => [ "user_name = ?", user_name]
Is there any such system for escaping injection in :order? It seems to only take a string and feed it to the SQL statement. This causes a vulnerability when using params to set :order, as in this function:
def list
  sort_by = params[:sort_by]
  @book_pages, @books = paginate :books,
                                 :order => sort_by,
                                 :per_page => 10
end
We've tried a few methods to sanitize sort_by, such as :order => ['?', sort_by], but it just passes that to the SQL statement like a flattened array. The 'escaping magic' does not work for :order. Should I use gsub! to sanitize params[:sort_by]?
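For what it's worth, the approach usually suggested for this situation (not from the thread above) is to whitelist the permitted sort columns rather than trying to escape the raw value, since :order is interpolated directly into the SQL. A sketch with illustrative column names:
ALLOWED_SORT_COLUMNS = %w[title author created_at].freeze

def list
  sort_by = params[:sort_by]
  # Fall back to a safe default unless the parameter names a known column
  sort_by = 'title' unless ALLOWED_SORT_COLUMNS.include?(sort_by)

  @book_pages, @books = paginate :books,
                                 :order => sort_by,
                                 :per_page => 10
end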