Errors installing and using SDL with Perl on Windows

I have tried the following. From the Start menu I ran the CPAN Client and entered:
install Alien::SDL SDL
I chose 64-bit, and after a long while it ends with errors. What is wrong? It seems simple, but it's not.
Then, in cmd on Windows 7, I ran cpan -i SDL, and it gives me (last part):
t\sdlx_controller_interface.t ... ok
t\sdlx_fps.t .................... ok
t\sdlx_layermanager.t ........... ok
t\sdlx_music.t .................. ok
t\sdlx_rect.t ................... ok
t\sdlx_sfont.t .................. ok
t\sdlx_sound.t .................. ok
t\sdlx_sprite.t ................. ok
t\sdlx_sprite_animated.t ........ ok
t\sdlx_surface.t ................ ok
t\sdlx_text.t ................... ok
t\sdlx_validate.t ............... ok
t\smpeg.t ....................... skipped: smpeg support not compiled
t\ttf.t ......................... ok
t\ttf_font.t .................... ok
Test Summary Report
-------------------
t\core.t (Wstat: 0 Tests: 28 Failed: 0)
TODO passed: 21-22
t\core_video.t (Wstat: 768 Tests: 71 Failed: 0)
TODO passed: 57, 59
Non-zero exit status: 3
Parse errors: No plan found in TAP output
Files=59, Tests=3788, 188 wallclock secs ( 0.56 usr + 0.08 sys = 0.64 CPU)
Result: FAIL
Failed 1/59 test programs. 0/3788 subtests failed.
FROGGS/SDL-2.546.tar.gz
E:\_win_7\Dwimperl\perl\bin\perl.exe ./Build test -- NOT OK
//hint// to see the cpan-testers results for installing this module, try:
reports FROGGS/SDL-2.546.tar.gz
Running Build install
make test had returned bad status, won't install without force
So I tried "cpan -i SDL --force", which gives:
Test Summary Report
-------------------
t\core.t (Wstat: 0 Tests: 28 Failed: 0)
TODO passed: 21-22
t\core_video.t (Wstat: 768 Tests: 71 Failed: 0)
TODO passed: 57, 59
Non-zero exit status: 3
Parse errors: No plan found in TAP output
Files=59, Tests=3788, 147 wallclock secs ( 0.47 usr + 0.13 sys = 0.59 CPU)
Result: FAIL
Failed 1/59 test programs. 0/3788 subtests failed.
FROGGS/SDL-2.546.tar.gz
E:\_win_7\Dwimperl\perl\bin\perl.exe ./Build test -- NOT OK
//hint// to see the cpan-testers results for installing this module, try:
reports FROGGS/SDL-2.546.tar.gz
Running Build install
make test had returned bad status, won't install without force
Warning: Cannot install --force, don't know what it is.
Try the command
i /--force/
to find objects with matching identifiers.

So I tried:
cpan -i --force SDL
No errors, but I can't run Padre anymore. I tried
perl name.pl
(a program using SDL), and it gives:

Related

strange behavior of cmake ctest for bigger CTEST_PARALLEL_LEVEL

I am new to SO.
I have a simple unit test where I do the following operations:
calculate the square root of a number using the mysqrt library;
add the square root result to the same number and display the result.
When I run the code with CTEST_PARALLEL_LEVEL = 1, all my test cases pass.
But when I use CTEST_PARALLEL_LEVEL = 8, my test cases sometimes fail, for inputs that are not the same in every run.
About 99% of the runs pass completely, but roughly 1% fail.
Error:
mysqrt.o: file not recognized: File truncated
I have deleted the object files explicitly with rm *.o, but the error still comes back after a few runs.
I am not sure why this error occurs with CTEST_PARALLEL_LEVEL = 8.
I am attaching only my three CMakeLists.txt files, as they should be enough for the Stack Overflow experts to understand the issue.
NOTE: Per Stack Overflow guidelines, I am not attaching the source code of the sqrt and addition functions, to keep the question from getting too long.
My folder structure:
SAMPLE_TEST
├── CMakeLists.txt
├── MathFunctions
│   ├── CMakeLists.txt
│   ├── MathFunctions.h
│   └── mysqrt.cpp
└── unit_test
    ├── CMakeLists.txt
    └── step2
        ├── CMakeLists.txt
        ├── execute.cpp
        └── tutorial.cpp
SAMPLE_TEST
CMakeLists.txt
cmake_minimum_required(VERSION 3.1)
project(Tutorial)
ENABLE_TESTING()
add_subdirectory(MathFunctions)
add_subdirectory(unit_test)
MathFunctions folder
CMakeLists.txt
add_library(MathFunctions mysqrt.cpp)
set(REF_FILES mysqrt.cpp)
add_definitions(-Wall -Wextra -pedantic -std=c++11)
add_custom_target(build_reference_library
DEPENDS sqrtlib
COMMENT "Generating sqrtlib")
ADD_LIBRARY(sqrtlib OBJECT ${REF_FILES})
unit_test folder
CMakeLists.txt
set(REF_MATHLIB_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../MathFunctions)
macro(GENERATION file input)
set(ip_generator ctest_input_${input})
add_executable(${ip_generator}
${file}
$<TARGET_OBJECTS:sqrtlib>
)
target_compile_options(${ip_generator} PUBLIC
-Wall -Wextra -g -std=c++11
-DCTEST_INPUT=${input})
target_link_libraries(${ip_generator} PUBLIC
dl pthread
)
target_include_directories(${ip_generator} PUBLIC
${REF_MATHLIB_DIR}
)
set(INPUT_FILE0 ip0_${input}.y)
set(INPUT_FILE0_TXT ip0_${input}.txt)
add_custom_command(
OUTPUT ${INPUT_FILE0} ${INPUT_FILE0_TXT}
COMMAND ${ip_generator} > ${INPUT_FILE0_TXT}
MAIN_DEPENDENCY ${sqrtlib}
COMMENT "Generating output files of for testcase")
add_custom_target(gen_input_${input}
DEPENDS ${INPUT_FILE0}
COMMENT "Generated output files")
endmacro()
####################
macro(EXECUTE file input)
get_filename_component(main_base_name ${file} NAME_WE)
set(main_base_name_mangled ${main_base_name}_${input})
set(exe_generator ctest_ref_${input})
add_executable(${exe_generator}
${file}
$<TARGET_OBJECTS:sqrtlib>
)
target_compile_options(${exe_generator} PUBLIC
-Wall -Wextra -g -std=c++11
-DCTEST_INPUT=${input})
target_link_libraries(${exe_generator} PUBLIC
dl pthread
)
target_include_directories(${exe_generator} PUBLIC
${REF_MATHLIB_DIR}
)
set(INPUT_FILE0 ip0_${input}.y)
set(EXE_FILE0 exeadd_${input}.y)
set(EXE_FILE_TXT exeadd_${input}.txt)
add_custom_command(
OUTPUT ${EXE_FILE0} ${EXE_FILE_TXT}
COMMAND ${exe_generator} > ${EXE_FILE_TXT}
MAIN_DEPENDENCY ${INPUT_FILE0} ${sqrtlib}
COMMENT "Generating output files of for testcase")
add_custom_target(gen_execute_${input}
DEPENDS ${EXE_FILE0}
COMMENT "Generated output files")
# add test to simulate
add_test(NAME ctest_execute_${input}
COMMAND ${CMAKE_COMMAND} --build ${CMAKE_BINARY_DIR}
--target gen_execute_${input})
#add_dependencies(execute_${main_base_name_mangled}
#gen_input)
endmacro()
#++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++#
# add test directories
#++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++#
set(TEST_DIRECTORIES
step2
)
foreach(dir ${TEST_DIRECTORIES})
add_subdirectory(${dir})
endforeach()
step2 folder
CMakeLists.txt
set(UT_IPGEN_FILES tutorial.cpp)
set(UT_EXECUTE_FILES execute.cpp)
set(input_integer_range 1 4 9 16 25 36 49 64 81 100 121 144 )
foreach(ip_integer ${input_integer_range})
GENERATION(${UT_IPGEN_FILES} ${ip_integer})
EXECUTE(${UT_EXECUTE_FILES} ${ip_integer})
endforeach(ip_integer)
Result:
1st Run:
Start 1: ctest_execute_1
Start 2: ctest_execute_4
Start 3: ctest_execute_9
Start 4: ctest_execute_16
Start 5: ctest_execute_25
Start 6: ctest_execute_36
Start 7: ctest_execute_49
Start 8: ctest_execute_64
1/12 Test #4: ctest_execute_16 .................***Failed 1.14 sec
2/12 Test #6: ctest_execute_36 ................. Passed 1.27 sec
3/12 Test #7: ctest_execute_49 ................. Passed 1.32 sec
4/12 Test #8: ctest_execute_64 ................. Passed 1.32 sec
Start 9: ctest_execute_81
Start 10: ctest_execute_100
Start 11: ctest_execute_121
Start 12: ctest_execute_144
5/12 Test #1: ctest_execute_1 .................. Passed 1.33 sec
6/12 Test #2: ctest_execute_4 .................. Passed 1.33 sec
7/12 Test #3: ctest_execute_9 .................. Passed 1.33 sec
8/12 Test #5: ctest_execute_25 ................. Passed 1.33 sec
9/12 Test #10: ctest_execute_100 ................ Passed 0.54 sec
10/12 Test #11: ctest_execute_121 ................ Passed 0.55 sec
11/12 Test #9: ctest_execute_81 ................. Passed 0.55 sec
12/12 Test #12: ctest_execute_144 ................ Passed 0.55 sec
92% tests passed, 1 tests failed out of 12
Total Test time (real) = 1.88 sec
The following tests FAILED:
4 - ctest_execute_16 (Failed)
2nd Run:
Start 1: ctest_execute_1
Start 2: ctest_execute_4
Start 3: ctest_execute_9
Start 4: ctest_execute_16
Start 5: ctest_execute_25
Start 6: ctest_execute_36
Start 7: ctest_execute_49
Start 8: ctest_execute_64
1/12 Test #6: ctest_execute_36 ................. Passed 1.31 sec
2/12 Test #7: ctest_execute_49 ................. Passed 1.36 sec
3/12 Test #8: ctest_execute_64 ................. Passed 1.36 sec
Start 9: ctest_execute_81
Start 10: ctest_execute_100
Start 11: ctest_execute_121
4/12 Test #1: ctest_execute_1 .................. Passed 1.37 sec
5/12 Test #2: ctest_execute_4 .................. Passed 1.37 sec
6/12 Test #3: ctest_execute_9 .................. Passed 1.36 sec
7/12 Test #4: ctest_execute_16 ................. Passed 1.36 sec
8/12 Test #5: ctest_execute_25 ................. Passed 1.37 sec
Start 12: ctest_execute_144
9/12 Test #11: ctest_execute_121 ................ Passed 0.50 sec
10/12 Test #10: ctest_execute_100 ................ Passed 0.51 sec
11/12 Test #9: ctest_execute_81 ................. Passed 0.51 sec
12/12 Test #12: ctest_execute_144 ................ Passed 0.34 sec
100% tests passed, 0 tests failed out of 12
Total Test time (real) = 2.01 sec
Your tests execute
COMMAND ${CMAKE_COMMAND} --build ${CMAKE_BINARY_DIR} --target ...
which effectively runs make (or whatever build tool you use) in the project's build directory.
But concurrent invocations of make in the same directory are never guaranteed to work correctly. This is why you get weird errors when running the tests in parallel (with the CTEST_PARALLEL_LEVEL variable set).
E.g. all of these tests try to create the same object file, mysqrt.o, and that creation is definitely not safe to do concurrently.
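To see why rebuilding the same object file under a concurrent reader breaks, here is a small deterministic illustration (Python, with an illustrative file name; in the real race the two writers are parallel make invocations and the reader is the linker):

```python
# Deterministic illustration of the mysqrt.o race: a second "compiler"
# truncates the file while a reader still expects the full contents.
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "mysqrt.o")

# The first compile finishes and writes a complete (fake) object file.
with open(path, "wb") as f:
    f.write(b"\x7fELF" + b"x" * 100)

# A concurrent compile of the same source starts: opening the output
# with mode "wb" truncates it to zero bytes immediately.
second_compile = open(path, "wb")

# Before the second compile finishes writing, the "linker" reads the file
# and finds it truncated -- the "file not recognized: File truncated" case.
with open(path, "rb") as f:
    data = f.read()
print(len(data))  # 0

second_compile.close()
os.remove(path)
```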
By running
make sqrtlib
before
ctest
you can be sure that the object file already exists when the tests run, so they will not attempt to create it again.
But you could still get other conflicts between the parallel tests.
It depends on what you actually want to check, but usually a test checks the behavior of some program or library; it is not meant to check the compilation (building) of that program. Because of that, the build commands are performed before the testing.
Usually it is convenient to implement the following workflow for testing:
# Configure the project
cmake <source-directory>
# Build the project.
# It builds both program/library intended, and the tests themselves.
make
# run tests
ctest <params>
In that case a test could have the following definition:
add_test(NAME ctest_execute_${input} COMMAND ${exe_generator})
(Unless you want to check the output of the test in some automatic way, there is no need to explicitly save it by redirecting to a file. ctest itself collects the output of each test, so you can read it if needed.)

bitbake rpi-test-image with MACHINE=raspberrypi3-64 failing to build on the zeus release of yocto

I'm following the readthedocs instructions for the meta-raspberrypi layer and trying to build the rpi-test-image image for the raspberrypi3-64 machine contained in the meta-raspberrypi layer using the zeus release of yocto.
I added this to conf/local.conf:
MACHINE ??= "raspberrypi3-64"
ENABLE_UART = "1"
My bblayers.conf file looks like this:
/opt/yocto/workspace/rpi3-64-build/conf[master]☢ ☠$ cat bblayers.conf
# POKY_BBLAYERS_CONF_VERSION is increased each time build/conf/bblayers.conf
# changes incompatibly
POKY_BBLAYERS_CONF_VERSION = "2"
BBPATH = "${TOPDIR}"
BBFILES ?= ""
BBLAYERS ?= " \
/opt/yocto/workspace/sources/poky/meta \
/opt/yocto/workspace/sources/poky/meta-poky \
/opt/yocto/workspace/sources/poky/meta-yocto-bsp \
/opt/yocto/workspace/sources/meta-openembedded/meta-oe \
/opt/yocto/workspace/sources/meta-openembedded/meta-multimedia \
/opt/yocto/workspace/sources/meta-openembedded/meta-networking \
/opt/yocto/workspace/sources/meta-openembedded/meta-python \
/opt/yocto/workspace/sources/meta-raspberrypi \
"
I added the 4 items from meta-openembedded as a work-around for ERROR: Nothing RPROVIDES 'bigbuckbunny-480p'.
This enables the build to run but it exits with the following errors:
/opt/yocto/workspace/rpi3-64-build/conf[master]☢ ☠$ bitbake rpi-test-image
Parsing recipes: 100% |#########################################################################################################################################| Time: 0:02:49
Parsing of 2340 .bb files complete (0 cached, 2340 parsed). 3464 targets, 133 skipped, 0 masked, 0 errors.
NOTE: Resolving any missing task queue dependencies
Build Configuration:
BB_VERSION = "1.44.0"
BUILD_SYS = "x86_64-linux"
NATIVELSBSTRING = "ubuntu-16.04"
TARGET_SYS = "aarch64-poky-linux"
MACHINE = "raspberrypi3-64"
DISTRO = "poky"
DISTRO_VERSION = "3.0.2"
TUNE_FEATURES = "aarch64 cortexa53 crc"
TARGET_FPU = ""
meta
meta-poky
meta-yocto-bsp = "zeus:74f229160c7f4037107c1dad8f0d02128c080a7e"
meta-oe
meta-multimedia
meta-networking
meta-python = "zeus:9e60d30669a2ad0598e9abf0cd15ee06b523986b"
meta-raspberrypi = "zeus:0e05098853eea77032bff9cf81955679edd2f35d"
Initialising tasks: 100% |######################################################################################################################################| Time: 0:00:06
Sstate summary: Wanted 1338 Found 0 Missed 1338 Current 0 (0% match, 0% complete)
NOTE: Executing Tasks
NOTE: Setscene tasks completed
WARNING: icu-native-64.2-r0 do_fetch: Checksum mismatch for local file /opt/yocto/cache/downloads/icu4c-64_2-src.tgz
Cleaning and trying again.
WARNING: icu-native-64.2-r0 do_fetch: Renaming /opt/yocto/cache/downloads/icu4c-64_2-src.tgz to /opt/yocto/cache/downloads/icu4c-64_2-src.tgz_bad-checksum_abb12cb25a05198ad8f4c1e6f668fa05
WARNING: icu-native-64.2-r0 do_fetch: Checksum failure encountered with download of http://download.icu-project.org/files/icu4c/64.2/icu4c-64_2-src.tgz - will attempt other sources if available
ERROR: rpi-test-image-1.0-r0 do_rootfs: Could not invoke dnf. Command '/opt/yocto/workspace/rpi3-64-build/tmp/work/raspberrypi3_64-poky-linux/rpi-test-image/1.0-r0/recipe-sysroot-native/usr/bin/dnf -v --rpmverbosity=info -y -c /opt/yocto/workspace/rpi3-64-build/tmp/work/raspberrypi3_64-poky-linux/rpi-test-image/1.0-r0/rootfs/etc/dnf/dnf.conf --setopt=reposdir=/opt/yocto/workspace/rpi3-64-build/tmp/work/raspberrypi3_64-poky-linux/rpi-test-image/1.0-r0/rootfs/etc/yum.repos.d --installroot=/opt/yocto/workspace/rpi3-64-build/tmp/work/raspberrypi3_64-poky-linux/rpi-test-image/1.0-r0/rootfs --setopt=logdir=/opt/yocto/workspace/rpi3-64-build/tmp/work/raspberrypi3_64-poky-linux/rpi-test-image/1.0-r0/temp --repofrompath=oe-repo,/opt/yocto/workspace/rpi3-64-build/tmp/work/raspberrypi3_64-poky-linux/rpi-test-image/1.0-r0/oe-rootfs-repo --nogpgcheck install psplash-raspberrypi packagegroup-core-boot packagegroup-base-extended run-postinsts packagegroup-rpi-test locale-base-en-us locale-base-en-gb' returned 1:
DNF version: 4.2.2
cachedir: /opt/yocto/workspace/rpi3-64-build/tmp/work/raspberrypi3_64-poky-linux/rpi-test-image/1.0-r0/rootfs/var/cache/dnf
Added oe-repo repo from /opt/yocto/workspace/rpi3-64-build/tmp/work/raspberrypi3_64-poky-linux/rpi-test-image/1.0-r0/oe-rootfs-repo
repo: using cache for: oe-repo
not found other for:
not found modules for:
not found deltainfo for:
not found updateinfo for:
oe-repo: using metadata from Thu 30 Apr 2020 09:25:08 PM UTC.
Last metadata expiration check: 0:00:02 ago on Thu 30 Apr 2020 09:25:11 PM UTC.
No module defaults found
--> Starting dependency resolution
--> Finished dependency resolution
Error:
Problem: package packagegroup-base-wifi-1.0-r83.raspberrypi3_64 requires wireless-regdb-static, but none of the providers can be installed
- package wireless-regdb-2019.06.03-r0.noarch conflicts with wireless-regdb-static provided by wireless-regdb-static-2019.06.03-r0.noarch
- package packagegroup-base-1.0-r83.raspberrypi3_64 requires packagegroup-base-wifi, but none of the providers can be installed
- package packagegroup-rpi-test-1.0-r0.noarch requires wireless-regdb, but none of the providers can be installed
- package packagegroup-base-extended-1.0-r83.raspberrypi3_64 requires packagegroup-base, but none of the providers can be installed
- conflicting requests
(try to add '--allowerasing' to command line to replace conflicting packages or '--skip-broken' to skip uninstallable packages or '--nobest' to use not only best candidate packages)
ERROR: Logfile of failure stored in: /opt/yocto/workspace/rpi3-64-build/tmp/work/raspberrypi3_64-poky-linux/rpi-test-image/1.0-r0/temp/log.do_rootfs.23419
ERROR: Task (/opt/yocto/workspace/sources/meta-raspberrypi/recipes-core/images/rpi-test-image.bb:do_rootfs) failed with exit code '1'
NOTE: Tasks Summary: Attempted 3511 tasks of which 1 didn't need to be rerun and 1 failed.
Summary: 1 task failed:
/opt/yocto/workspace/sources/meta-raspberrypi/recipes-core/images/rpi-test-image.bb:do_rootfs
Summary: There were 3 WARNING messages shown.
Summary: There was 1 ERROR message shown, returning a non-zero exit code.
/opt/yocto/workspace/rpi3-64-build/conf[master]☢ ☠$
I'm not even sure how to go about debugging it.
I've tried googling the warnings and error messages but was not able to find anything that might be causing the problem.
2 questions:
1: What might the problem be in this particular instance?
2: What is a good general procedure for debugging failed builds like this?
UPDATE: "bitbake core-image-base" does successfully build the image, but it is still unclear to me why "bitbake rpi-test-image" failed.

install oci8 on php7.3

I am trying to compile oci8-2.2 from source for PHP 7.3 on CentOS 7.5, but it is giving me a huge failed summary report.
=====================================================================
TEST RESULT SUMMARY
---------------------------------------------------------------------
Exts skipped : 0
Exts tested : 33
---------------------------------------------------------------------
Number of tests :  361       354
Tests skipped   :    7 (  1.9%) --------
Tests warned    :    0 (  0.0%) (  0.0%)
Tests failed    :  342 ( 94.7%) ( 96.6%)
Expected fail   :    0 (  0.0%) (  0.0%)
Tests passed    :   12 (  3.3%) (  3.4%)
---------------------------------------------------------------------
Time taken : 29 seconds
=====================================================================
I am not sure if this is normal or if it is meant to look like this. When I run PHP locally it does not show oci8 as loaded:
php -S 0.0.0.0:33080 -t /var/httpd/domain/domain.com/docs -c /opt/SP/php/etc/php.ini
Set valid database connection credentials in oci8/tests/details.inc before you run the tests.
Make sure you have the Oracle Client libraries in your search path. If you are using Oracle Instant Client then using ldconfig would be easiest.
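If `php -m` does not list oci8, the built extension may simply not be enabled in the php.ini that the CLI actually loads (check which one that is with `php --ini`). A minimal fragment, assuming the extension was installed into the default extension_dir under the module name the build produces:

```ini
; in /opt/SP/php/etc/php.ini (the file passed with -c above)
; enable the freshly compiled extension
extension=oci8.so
```

After editing, `php -m | grep oci8` should list the module; if the Oracle client libraries cannot be found at load time, PHP prints a startup warning instead.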

How can I debug failed tests in guile source installation?

I'm building GNU guile-2.0.0 from source on an old RHEL 6.6 machine. It seems to build successfully, but I get the following error when I run make check. As I'm very new to Scheme and Guile, I have no idea how to find the reason for this failure.
How can I debug this? Where can I find any clue to this?
....
Running ports.test
Running posix.test
FAIL: posix.test: utime: valid argument (second resolution)
Running print.test
Running procprop.test
....
Totals for this test run:
passes: 34320
failures: 1
unexpected passes: 0
expected failures: 31
unresolved test cases: 15
untested test cases: 5
unsupported test cases: 9
errors: 0
FAIL: check-guile
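The failing test exercises utime() at second resolution; before digging into Guile itself, it can help to see how the machine's filesystem actually stores timestamps set via utime(). A quick probe (Python used only for illustration; the behaviour under test belongs to the OS and filesystem, not Guile):

```python
# Probe how the filesystem stores an mtime set via utime():
# set a half-second timestamp and read back what stat() reports.
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

os.utime(path, (1234567890.5, 1234567890.5))  # (atime, mtime)
mtime = os.stat(path).st_mtime

# A filesystem with sub-second timestamps reports 1234567890.5;
# one with whole-second resolution reports 1234567890.0.
print(mtime)
os.remove(path)
```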

Maven+TestNG testing through Jenkins on Ubuntu

I have built a Maven + TestNG testing-framework project in Eclipse. I installed Jenkins on my local host and have run my project there until now without any problem. My OS is Windows 10. I have also run the project on other Windows machines without any problem.
However, I wanted cross-platform testing, so I tried running the project on an Ubuntu machine. I did all the setup: I configured the Firefox binary path and installed an X server, i.e. Xvfb, on that Ubuntu machine.
I made some changes in my WebDriver Java file:
package com.mednet.webdriver;

import java.io.File;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxBinary;
import org.openqa.selenium.firefox.FirefoxDriver;

public class WebdriverSetUp {
    public static WebDriver driver = null;

    public static WebDriver startdriver() {
        FirefoxBinary binary = new FirefoxBinary(new File("/opt/firefox/firefox"));
        binary.setEnvironmentProperty("DISPLAY", System.getProperty("lmportal.xvfb.id", ":99"));
        return driver = new FirefoxDriver(binary, null);
    }
}
Then I used the following commands in the Ubuntu terminal to install Firefox and Xvfb:
to install firefox: sudo apt-get install firefox
to install xvfb: sudo apt-get xvfb & export DISPLAY=:99 & xvfb 99 -ac
Here I used DISPLAY=:99
Now when I start the build in Jenkins, the following is my console output:
Results :
Failed tests: testInvestigation(com.mednet.executor.diagnostics.ExecTest1): Timed out after 50 seconds waiting for visibility of element located by By.id: lineItemForCounterBilling(..)
testInvestigation(com.mednet.executor.diagnostics.ExecTest6): Timed out after 50 seconds waiting for visibility of element located by By.id: lineItemForCounterBilling(..)
Tests run: 2, Failures: 2, Errors: 0, Skipped: 0
[ERROR] There are test failures.
Please refer to /var/lib/jenkins/workspace/MednetTestingFramework/target/surefire-reports for the individual test results.
[JENKINS] Recording test results
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:19 min
[INFO] Finished at: 2016-08-25T11:06:08+05:30
[INFO] Final Memory: 34M/381M
[INFO] ------------------------------------------------------------------------
[JENKINS] Archiving /var/lib/jenkins/workspace/MednetTestingFramework/pom.xml to com.mednet/MednetTestingFrameworkV1/0.0.1-SNAPSHOT/MednetTestingFrameworkV1-0.0.1-SNAPSHOT.pom
channel stopped
Finished: UNSTABLE
The test stops after the login steps that I put in. There is nothing wrong with my element, as it is located without any problem on my Windows machine.
I searched the internet about the problem. In many sources I found that the default Xvfb screen resolution is usually not large, so I suppose I need to increase the screen size, but I am not able to.
I tried many commands, such as xvfb :99 -screen 0 1024*720 or 1440*900, but to no avail: it shows an invalid screen resolution.
I need help with any important step that I may have missed.
Thanks in advance.
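One thing worth checking: Xvfb's -screen option takes a geometry of the form WIDTHxHEIGHTxDEPTH, joined by the letter x, so an argument like 1024*720 is rejected as an invalid resolution. A sketch of the server start-up for the Ubuntu node, assuming display :99 as used above:

```shell
# Geometry is WIDTHxHEIGHTxDEPTH ('x', not '*'); note the capital X in Xvfb.
Xvfb :99 -screen 0 1440x900x24 -ac &
export DISPLAY=:99
```

Alternatively, if the xvfb package's helper script is installed, `xvfb-run -s "-screen 0 1440x900x24" <command>` starts a throwaway server around a single command.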
